Oct 09 19:28:33 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 09 19:28:33 crc restorecon[4670]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 19:28:33 crc restorecon[4670]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc 
restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc 
restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 
19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc 
restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc 
restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:33
crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 
19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc 
restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc 
restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc 
restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 19:28:33 crc restorecon[4670]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:33 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 
crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc 
restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 19:28:34 crc restorecon[4670]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc 
restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc 
restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc 
restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc 
restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 19:28:34 crc restorecon[4670]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 09 19:28:34 crc kubenswrapper[4907]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 19:28:34 crc kubenswrapper[4907]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 09 19:28:34 crc kubenswrapper[4907]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 19:28:34 crc kubenswrapper[4907]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 09 19:28:34 crc kubenswrapper[4907]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 09 19:28:34 crc kubenswrapper[4907]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.884852 4907 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890763 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890787 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890793 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890798 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890811 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890817 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890824 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890829 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890835 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 19:28:34 crc 
kubenswrapper[4907]: W1009 19:28:34.890840 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890845 4907 feature_gate.go:330] unrecognized feature gate: Example
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890851 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890856 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890861 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890865 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890870 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890876 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890889 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890896 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890902 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890907 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890912 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890917 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890922 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890926 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890935 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890940 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890947 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890953 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890967 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890973 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890980 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890986 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890990 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.890995 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891000 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891007 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891011 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891015 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891022 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891032 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891036 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891041 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891046 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891050 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891055 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891058 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891062 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891066 4907 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891070 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891074 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891078 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891082 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891089 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891093 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891096 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891100 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891104 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891107 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891112 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891118 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891125 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891130 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891134 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891138 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891144 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891147 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891151 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891155 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891161 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.891165 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892764 4907 flags.go:64] FLAG: --address="0.0.0.0"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892821 4907 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892837 4907 flags.go:64] FLAG: --anonymous-auth="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892846 4907 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892857 4907 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892863 4907 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892872 4907 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892880 4907 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892885 4907 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892890 4907 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892897 4907 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892902 4907 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892908 4907 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892913 4907 flags.go:64] FLAG: --cgroup-root=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892917 4907 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892922 4907 flags.go:64] FLAG: --client-ca-file=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892927 4907 flags.go:64] FLAG: --cloud-config=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892931 4907 flags.go:64] FLAG: --cloud-provider=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892935 4907 flags.go:64] FLAG: --cluster-dns="[]"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892944 4907 flags.go:64] FLAG: --cluster-domain=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892949 4907 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892953 4907 flags.go:64] FLAG: --config-dir=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892958 4907 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892963 4907 flags.go:64] FLAG: --container-log-max-files="5"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892970 4907 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892975 4907 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892979 4907 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892984 4907 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892988 4907 flags.go:64] FLAG: --contention-profiling="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892992 4907 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.892997 4907 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893002 4907 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893007 4907 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893016 4907 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893021 4907 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893025 4907 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893029 4907 flags.go:64] FLAG: --enable-load-reader="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893034 4907 flags.go:64] FLAG: --enable-server="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893038 4907 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893045 4907 flags.go:64] FLAG: --event-burst="100"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893049 4907 flags.go:64] FLAG: --event-qps="50"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893054 4907 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893059 4907 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893063 4907 flags.go:64] FLAG: --eviction-hard=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893069 4907 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893073 4907 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893078 4907 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893082 4907 flags.go:64] FLAG: --eviction-soft=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893086 4907 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893090 4907 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893094 4907 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893098 4907 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893102 4907 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893106 4907 flags.go:64] FLAG: --fail-swap-on="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893111 4907 flags.go:64] FLAG: --feature-gates=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893116 4907 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893120 4907 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893128 4907 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893133 4907 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893138 4907 flags.go:64] FLAG: --healthz-port="10248"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893143 4907 flags.go:64] FLAG: --help="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893148 4907 flags.go:64] FLAG: --hostname-override=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893152 4907 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893158 4907 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893162 4907 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893166 4907 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893170 4907 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893175 4907 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893179 4907 flags.go:64] FLAG: --image-service-endpoint=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893183 4907 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893187 4907 flags.go:64] FLAG: --kube-api-burst="100"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893191 4907 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893196 4907 flags.go:64] FLAG: --kube-api-qps="50"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893200 4907 flags.go:64] FLAG: --kube-reserved=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893204 4907 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893209 4907 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893213 4907 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893217 4907 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893221 4907 flags.go:64] FLAG: --lock-file=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893227 4907 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893231 4907 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893237 4907 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893246 4907 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893252 4907 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893256 4907 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893261 4907 flags.go:64] FLAG: --logging-format="text"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893266 4907 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893271 4907 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893275 4907 flags.go:64] FLAG: --manifest-url=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893280 4907 flags.go:64] FLAG: --manifest-url-header=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893288 4907 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893293 4907 flags.go:64] FLAG: --max-open-files="1000000"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893299 4907 flags.go:64] FLAG: --max-pods="110"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893304 4907 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893308 4907 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893313 4907 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893317 4907 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893322 4907 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893326 4907 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893354 4907 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893377 4907 flags.go:64] FLAG: --node-status-max-images="50"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893382 4907 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893386 4907 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893391 4907 flags.go:64] FLAG: --pod-cidr=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893395 4907 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893406 4907 flags.go:64] FLAG: --pod-manifest-path=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893410 4907 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893416 4907 flags.go:64] FLAG: --pods-per-core="0"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893421 4907 flags.go:64] FLAG: --port="10250"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893425 4907 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893430 4907 flags.go:64] FLAG: --provider-id=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893434 4907 flags.go:64] FLAG: --qos-reserved=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893438 4907 flags.go:64] FLAG: --read-only-port="10255"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893442 4907 flags.go:64] FLAG: --register-node="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893447 4907 flags.go:64] FLAG: --register-schedulable="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893451 4907 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893482 4907 flags.go:64] FLAG: --registry-burst="10"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893486 4907 flags.go:64] FLAG: --registry-qps="5"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893490 4907 flags.go:64] FLAG: --reserved-cpus=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893496 4907 flags.go:64] FLAG: --reserved-memory=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893502 4907 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893507 4907 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893511 4907 flags.go:64] FLAG: --rotate-certificates="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893515 4907 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893519 4907 flags.go:64] FLAG: --runonce="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893524 4907 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893528 4907 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893532 4907 flags.go:64] FLAG: --seccomp-default="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893538 4907 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893543 4907 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893548 4907 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893552 4907 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893557 4907 flags.go:64] FLAG: --storage-driver-password="root"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893561 4907 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893565 4907 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893577 4907 flags.go:64] FLAG: --storage-driver-user="root"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893581 4907 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893586 4907 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893591 4907 flags.go:64] FLAG: --system-cgroups=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893595 4907 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893604 4907 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893608 4907 flags.go:64] FLAG: --tls-cert-file=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893613 4907 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893619 4907 flags.go:64] FLAG: --tls-min-version=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893624 4907 flags.go:64] FLAG: --tls-private-key-file=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893628 4907 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893632 4907 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893637 4907 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893641 4907 flags.go:64] FLAG: --v="2"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893651 4907 flags.go:64] FLAG: --version="false"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893658 4907 flags.go:64] FLAG: --vmodule=""
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893665 4907 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.893670 4907 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893869 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893876 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893883 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893887 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893891 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893895 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893899 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893904 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893908 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893911 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893916 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893920 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893923 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893927 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893935 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893942 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893950 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893955 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893960 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893964 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893968 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893973 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893977 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893982 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893986 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893991 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.893995 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894000 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894006 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894014 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894021 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894026 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894030 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894035 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894039 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894044 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894048 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894052 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894058 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894064 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894069 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894074 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894078 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894083 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894088 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894092 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894101 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894106 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894110 4907 feature_gate.go:330] unrecognized feature gate: Example
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894115 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894120 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894124 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894130 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894135 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894140 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894145 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894150 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894155 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894159 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894165 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894171 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894175 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894180 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894184 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894188 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894193 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894197 4907 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894202 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894206 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894210 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.894214 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.894225 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.908655 4907 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.908695 4907 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908779 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908790 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908796 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908801 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908807 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908812 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908818 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908823 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908828 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908833 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908838 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908842 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908847 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908854 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908865 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908873 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908880 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908887 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908896 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908903 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908910 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908916 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908921 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908927 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908934 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908941 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908947 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908953 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908959 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908964 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908971 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908977 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908982 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908988 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.908995 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909002 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909008 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909013 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909018 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 19:28:34 crc 
kubenswrapper[4907]: W1009 19:28:34.909024 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909029 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909035 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909039 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909046 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909051 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909057 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909062 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909068 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909073 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909078 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909084 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909090 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909099 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909105 4907 feature_gate.go:330] unrecognized feature gate: 
InsightsRuntimeExtractor Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909111 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909118 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909124 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909129 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909134 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909139 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909145 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909150 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909156 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909161 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909166 4907 feature_gate.go:330] unrecognized feature gate: Example Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909172 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909178 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909183 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 19:28:34 crc 
kubenswrapper[4907]: W1009 19:28:34.909189 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909194 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909210 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.909218 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909388 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909399 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909406 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909413 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909418 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909425 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909430 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909435 4907 feature_gate.go:330] 
unrecognized feature gate: Example Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909441 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909446 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909452 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909458 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909482 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909489 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909494 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909500 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909505 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909513 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909519 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909525 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909531 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909538 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909544 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909550 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909555 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909560 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909565 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909571 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909576 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909581 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909586 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909591 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 19:28:34 crc 
kubenswrapper[4907]: W1009 19:28:34.909596 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909601 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909607 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909613 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909618 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909623 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909628 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909633 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909639 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909645 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909651 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909657 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909663 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909670 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909675 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909681 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909686 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909692 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909698 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909703 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909708 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909713 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909718 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909723 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909729 4907 feature_gate.go:330] unrecognized 
feature gate: EtcdBackendQuota Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909734 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909739 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909744 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909750 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909756 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909761 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909766 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909772 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909777 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909782 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909787 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909792 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909797 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 19:28:34 crc kubenswrapper[4907]: W1009 19:28:34.909804 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 19:28:34 
crc kubenswrapper[4907]: I1009 19:28:34.909812 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.913213 4907 server.go:940] "Client rotation is on, will bootstrap in background" Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.917672 4907 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.918459 4907 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.920431 4907 server.go:997] "Starting client certificate rotation" Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.920484 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.920665 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-15 17:59:19.139102265 +0000 UTC Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.920772 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1606h30m44.218334109s for next certificate rotation Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.948766 4907 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.951070 4907 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 19:28:34 crc kubenswrapper[4907]: I1009 19:28:34.969634 4907 log.go:25] "Validated CRI v1 runtime API" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.012956 4907 log.go:25] "Validated CRI v1 image API" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.015339 4907 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.026122 4907 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-09-19-24-08-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.026776 4907 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.053652 4907 manager.go:217] Machine: {Timestamp:2025-10-09 19:28:35.050212845 +0000 UTC m=+0.582180364 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:de5ae157-82cf-491d-b46e-a75d3a70699d BootID:18e2d302-c2fb-4ade-9fd1-bc58926be156 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0e:9d:d9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0e:9d:d9 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:90:14:30 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ab:fb:26 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:cb:68:d9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:35:05:43 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:d4:12:b0:12:9c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4a:eb:a6:37:a0:83 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.054587 4907 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.054971 4907 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.056290 4907 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.056628 4907 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.056719 4907 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.057647 4907 topology_manager.go:138] "Creating topology manager with none policy" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.057716 4907 container_manager_linux.go:303] "Creating device plugin manager" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.058196 4907 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.058297 4907 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.058701 4907 state_mem.go:36] "Initialized new in-memory state store" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.058899 4907 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.062248 4907 kubelet.go:418] "Attempting to sync node with API server" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.062362 4907 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.062478 4907 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.062557 4907 kubelet.go:324] "Adding apiserver pod source" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.062635 4907 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 
19:28:35.066706 4907 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.067661 4907 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.069100 4907 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.070879 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.071004 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.070950 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071080 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.071104 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071119 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071168 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071189 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071223 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071244 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071262 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071285 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071305 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071323 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071374 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.071395 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.072338 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.073237 4907 server.go:1280] "Started 
kubelet" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.074579 4907 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.074863 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.074568 4907 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.075842 4907 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 09 19:28:35 crc systemd[1]: Started Kubernetes Kubelet. Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.077552 4907 server.go:460] "Adding debug handlers to kubelet server" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.078098 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.078161 4907 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.078635 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 21:40:02.993428932 +0000 UTC Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.081340 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1010h11m27.912097608s for next certificate rotation Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.082060 4907 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.082095 4907 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 09 19:28:35 crc 
kubenswrapper[4907]: I1009 19:28:35.082969 4907 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.081719 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.104:6443: connect: connection refused" interval="200ms" Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.083551 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.083666 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.078604 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.083394 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.104:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ce9573015aa9d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-09 19:28:35.073174173 +0000 UTC 
m=+0.605141702,LastTimestamp:2025-10-09 19:28:35.073174173 +0000 UTC m=+0.605141702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.092988 4907 factory.go:55] Registering systemd factory Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.093856 4907 factory.go:221] Registration of the systemd container factory successfully Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.094626 4907 factory.go:153] Registering CRI-O factory Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.094670 4907 factory.go:221] Registration of the crio container factory successfully Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.094800 4907 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.094874 4907 factory.go:103] Registering Raw factory Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.094921 4907 manager.go:1196] Started watching for new ooms in manager Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.096032 4907 manager.go:319] Starting recovery of all containers Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099059 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099139 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099165 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099187 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099208 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099229 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099250 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099273 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099757 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099825 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099848 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099869 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099891 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.099922 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100016 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100048 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100070 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100092 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100115 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100143 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100166 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100188 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100209 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100232 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100254 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100278 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100305 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100332 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100365 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100395 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100426 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100507 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100528 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100633 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100668 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100689 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100713 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100738 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" 
seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100758 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100782 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100803 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100827 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100849 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100873 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: 
I1009 19:28:35.100894 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100916 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100959 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.100989 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101011 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101031 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101054 4907 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101078 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101108 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101133 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101156 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101180 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101202 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101223 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101244 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101264 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101287 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101307 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101331 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101351 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101372 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101393 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101416 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101437 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101487 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101509 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101530 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101554 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101583 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101605 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101626 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101647 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101666 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101689 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101719 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101744 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101764 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" 
seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101785 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101813 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101834 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101853 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101874 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101906 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101927 4907 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101948 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101969 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.101989 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102010 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102033 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102053 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102117 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102138 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102159 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102183 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102205 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102226 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102251 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102271 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102292 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102312 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102357 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102553 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102588 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102620 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102645 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102668 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102694 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102716 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102741 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102762 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102784 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102803 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102824 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102846 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" 
seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102867 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102888 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102909 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102930 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102960 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.102982 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 
19:28:35.103002 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103022 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103046 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103066 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103090 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103117 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103146 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103174 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.103202 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.105823 4907 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.105878 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.105906 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.105930 4907 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.105954 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.105976 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.105996 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106020 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106039 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106068 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106089 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106111 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106131 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106153 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106172 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106194 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106215 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106238 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106260 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106280 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106300 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106325 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106344 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106387 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106406 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106429 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106449 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106516 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106540 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106559 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106579 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106598 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106619 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106638 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106659 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106678 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106697 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106716 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106735 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106756 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106776 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106798 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106819 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106840 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106862 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106884 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106911 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106943 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106971 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.106999 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107028 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107055 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107081 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107109 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107138 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107168 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107194 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107217 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107237 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107258 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107277 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107296 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107317 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107343 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107361 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107382 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107401 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107423 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107443 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107495 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" 
seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107516 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107537 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107556 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107577 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107596 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107616 4907 reconstruct.go:97] "Volume reconstruction finished" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.107630 4907 reconciler.go:26] "Reconciler: start to sync state" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.121893 4907 manager.go:324] Recovery completed Oct 09 19:28:35 
crc kubenswrapper[4907]: I1009 19:28:35.134987 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.137233 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.137264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.137275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.138047 4907 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.138060 4907 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.138078 4907 state_mem.go:36] "Initialized new in-memory state store" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.146253 4907 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.150108 4907 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.150146 4907 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.150175 4907 kubelet.go:2335] "Starting kubelet main sync loop" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.150266 4907 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.151156 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.151257 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.159133 4907 policy_none.go:49] "None policy: Start" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.160396 4907 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.160435 4907 state_mem.go:35] "Initializing new in-memory state store" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.185347 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.234857 4907 manager.go:334] "Starting Device Plugin manager" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.234932 4907 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.234958 4907 server.go:79] "Starting device plugin registration server" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.235549 4907 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.235590 4907 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.235779 4907 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.235960 4907 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.235979 4907 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.245783 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.251177 4907 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.251266 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.252313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.252362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.252381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.252577 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.252950 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.253042 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.253620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.253671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.253690 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.253821 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.253957 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.254005 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.254246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.254283 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.254298 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.254657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.254723 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.254744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.255106 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.255218 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.255267 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.255227 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.255347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.255361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256438 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256495 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256530 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256761 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256887 
4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.256925 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258278 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.258306 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.259858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.259883 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.259893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.285284 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.104:6443: connect: connection refused" interval="400ms" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.309875 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.309925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.309971 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310022 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310093 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310154 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310244 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310276 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310583 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310639 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310735 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310780 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.310825 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.336575 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.337854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.337900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.337913 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.337950 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.338547 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.104:6443: connect: connection 
refused" node="crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412075 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412144 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412177 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412207 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412237 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412252 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412264 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412264 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412328 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412340 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412361 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412357 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412394 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412406 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412411 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412413 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412439 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412451 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412486 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412506 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412521 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: 
I1009 19:28:35.412554 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412512 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412636 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412640 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412680 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412681 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.412602 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.539418 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.541176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.541226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.541238 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.541280 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.541986 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.104:6443: connect: connection 
refused" node="crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.594955 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.599592 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.621245 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.648022 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.653561 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.658071 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0cf11e001aece59060b218d970ceff22fcaa72940d349d292c005604a13fdc39 WatchSource:0}: Error finding container 0cf11e001aece59060b218d970ceff22fcaa72940d349d292c005604a13fdc39: Status 404 returned error can't find the container with id 0cf11e001aece59060b218d970ceff22fcaa72940d349d292c005604a13fdc39 Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.660685 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-bcf490336c5dc66e4571eb46b32dc2c746ac4a8c6929f49d58bc9f747ecdb6ae WatchSource:0}: Error finding container bcf490336c5dc66e4571eb46b32dc2c746ac4a8c6929f49d58bc9f747ecdb6ae: Status 404 returned error can't find the container with id 
bcf490336c5dc66e4571eb46b32dc2c746ac4a8c6929f49d58bc9f747ecdb6ae Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.675123 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a5c852c09c4b2b80ccda152df781c1c806515b864ad05c07b5454987d492e3b2 WatchSource:0}: Error finding container a5c852c09c4b2b80ccda152df781c1c806515b864ad05c07b5454987d492e3b2: Status 404 returned error can't find the container with id a5c852c09c4b2b80ccda152df781c1c806515b864ad05c07b5454987d492e3b2 Oct 09 19:28:35 crc kubenswrapper[4907]: W1009 19:28:35.676172 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9a4747bc6ddc169ec5f16ec87b05b4a6f2be54f499a71444b4cc0761b7ca1907 WatchSource:0}: Error finding container 9a4747bc6ddc169ec5f16ec87b05b4a6f2be54f499a71444b4cc0761b7ca1907: Status 404 returned error can't find the container with id 9a4747bc6ddc169ec5f16ec87b05b4a6f2be54f499a71444b4cc0761b7ca1907 Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.686594 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.104:6443: connect: connection refused" interval="800ms" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.943178 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.945506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.945561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:35 crc 
kubenswrapper[4907]: I1009 19:28:35.945576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:35 crc kubenswrapper[4907]: I1009 19:28:35.945614 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 19:28:35 crc kubenswrapper[4907]: E1009 19:28:35.946258 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.104:6443: connect: connection refused" node="crc" Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.076138 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.156026 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0cf11e001aece59060b218d970ceff22fcaa72940d349d292c005604a13fdc39"} Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.157999 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3a72f6ce98b097d81d453552b1263f57432efe26046ca61a1e27cb2e628b5173"} Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.160352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9a4747bc6ddc169ec5f16ec87b05b4a6f2be54f499a71444b4cc0761b7ca1907"} Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.162339 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a5c852c09c4b2b80ccda152df781c1c806515b864ad05c07b5454987d492e3b2"} Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.163819 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bcf490336c5dc66e4571eb46b32dc2c746ac4a8c6929f49d58bc9f747ecdb6ae"} Oct 09 19:28:36 crc kubenswrapper[4907]: W1009 19:28:36.205250 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:36 crc kubenswrapper[4907]: E1009 19:28:36.205368 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:36 crc kubenswrapper[4907]: W1009 19:28:36.473322 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:36 crc kubenswrapper[4907]: E1009 19:28:36.473845 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:36 crc kubenswrapper[4907]: E1009 19:28:36.487251 
4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.104:6443: connect: connection refused" interval="1.6s" Oct 09 19:28:36 crc kubenswrapper[4907]: W1009 19:28:36.579797 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:36 crc kubenswrapper[4907]: E1009 19:28:36.579878 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:36 crc kubenswrapper[4907]: W1009 19:28:36.590506 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:36 crc kubenswrapper[4907]: E1009 19:28:36.590595 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.746941 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.748719 4907 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.748774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.748792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:36 crc kubenswrapper[4907]: I1009 19:28:36.748838 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 19:28:36 crc kubenswrapper[4907]: E1009 19:28:36.749552 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.104:6443: connect: connection refused" node="crc" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.075968 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.172005 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.172097 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.172114 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.172128 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.172096 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.173394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.173440 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.173458 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.175399 4907 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9e486e0bd3789894e356b8835028d4ecd1bf0848156531f685a241f092b5cd93" exitCode=0 Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.175531 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9e486e0bd3789894e356b8835028d4ecd1bf0848156531f685a241f092b5cd93"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.175651 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.176947 4907 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.176994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.177010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.177762 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d3ff05209d16fb966d03bb41c5943bc7cff7a444bde6e7f126f9ff1d6479854a" exitCode=0 Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.177864 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d3ff05209d16fb966d03bb41c5943bc7cff7a444bde6e7f126f9ff1d6479854a"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.177976 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.180153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.180201 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.180230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.184047 4907 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e" exitCode=0 Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.184141 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.184169 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.185072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.185099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.185113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.187440 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e" exitCode=0 Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.187530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e"} Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.187650 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.188889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.188929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:37 
crc kubenswrapper[4907]: I1009 19:28:37.188946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.192511 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.200163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.200226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:37 crc kubenswrapper[4907]: I1009 19:28:37.200252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.005239 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.013725 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.076749 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:38 crc kubenswrapper[4907]: E1009 19:28:38.088969 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.104:6443: connect: connection refused" interval="3.2s" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.193964 4907 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="54824376d71d50fdc730df981b13b52c689691125f1635d73fae9edb9ca59591" exitCode=0 Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.194090 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.194084 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"54824376d71d50fdc730df981b13b52c689691125f1635d73fae9edb9ca59591"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.195122 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.195155 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.195166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.198233 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0ff791c39ddb973fb41489e48e41803fadf855cf25423f47501c62fbe002cac7"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.198294 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.199260 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.199297 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:38 crc kubenswrapper[4907]: 
I1009 19:28:38.199313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.203227 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.203294 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.203314 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.203335 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.204615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.204657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.204672 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.210708 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.210763 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.210781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.210792 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463"} Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.210818 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.214734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.214804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.214815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.350489 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 
19:28:38.351982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.352026 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.352038 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:38 crc kubenswrapper[4907]: I1009 19:28:38.352071 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 19:28:38 crc kubenswrapper[4907]: E1009 19:28:38.352783 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.104:6443: connect: connection refused" node="crc" Oct 09 19:28:38 crc kubenswrapper[4907]: W1009 19:28:38.468660 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.104:6443: connect: connection refused Oct 09 19:28:38 crc kubenswrapper[4907]: E1009 19:28:38.468772 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.104:6443: connect: connection refused" logger="UnhandledError" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.109488 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.217459 4907 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="39852f841ed98f045e3215238f101af52c7ef784c0976684c006f033d378a969" exitCode=0 Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.217616 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"39852f841ed98f045e3215238f101af52c7ef784c0976684c006f033d378a969"} Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.217664 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.219138 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.219215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.219256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.223869 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078"} Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.223906 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.223974 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.223996 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.224007 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.224056 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.224130 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225488 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225773 4907 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:39 crc kubenswrapper[4907]: I1009 19:28:39.225974 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.079015 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.232875 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7d4591279766a8da5539891803a3abd39fc8ab0522ff21c570a0d61513b0f57"} Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.232933 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.233021 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.232933 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.233099 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.232933 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7dd7324a85578c127decf2b1c7f641552e50086fc9fa8078ced3d98c5ca7af5f"} Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.233161 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"765eeb261d0e387fb63824327d530f70bd8c6625791c5f8f4572a9ac2f1b2ddf"} Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.233184 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be43eb572c8b1c69ebf93339bf933ac1fca5d434356f409da42b1439ac566162"} Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.234363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.234399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.234409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.234448 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.234517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.234533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.294245 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.294982 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.296577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.296640 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:40 crc kubenswrapper[4907]: I1009 19:28:40.296669 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.243407 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"29922ea66d77f8e78a4c5fe940c41f7013ce3ca20128e0ec2967a3b7869c2889"} Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.243461 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.243575 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.243599 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.245329 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.245359 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.245382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.245398 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.245417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.245415 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.271154 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.271362 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.271419 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.272695 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.272754 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.272780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.553494 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.554856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.554892 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.554902 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.554929 4907 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.569325 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 09 19:28:41 crc kubenswrapper[4907]: I1009 19:28:41.575744 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.110198 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.110333 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.245783 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.245840 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.245851 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.246990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.247095 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.247113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.247151 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.247126 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.247169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:42 crc kubenswrapper[4907]: I1009 19:28:42.993218 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.248674 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.249857 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.249889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.249903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.318858 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.319120 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.320892 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.320936 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:43 crc kubenswrapper[4907]: I1009 19:28:43.320956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.251820 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.252858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.252917 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.252949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.283227 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.283385 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.284286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.284317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:44 crc kubenswrapper[4907]: I1009 19:28:44.284328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 
19:28:45 crc kubenswrapper[4907]: E1009 19:28:45.245949 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 19:28:48 crc kubenswrapper[4907]: W1009 19:28:48.975659 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 19:28:48 crc kubenswrapper[4907]: I1009 19:28:48.975776 4907 trace.go:236] Trace[225848574]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 19:28:38.974) (total time: 10001ms): Oct 09 19:28:48 crc kubenswrapper[4907]: Trace[225848574]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:28:48.975) Oct 09 19:28:48 crc kubenswrapper[4907]: Trace[225848574]: [10.001505594s] [10.001505594s] END Oct 09 19:28:48 crc kubenswrapper[4907]: E1009 19:28:48.975803 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 19:28:49 crc kubenswrapper[4907]: I1009 19:28:49.077122 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 09 19:28:49 crc kubenswrapper[4907]: W1009 19:28:49.100733 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake 
timeout Oct 09 19:28:49 crc kubenswrapper[4907]: I1009 19:28:49.100839 4907 trace.go:236] Trace[916872589]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 19:28:39.096) (total time: 10004ms): Oct 09 19:28:49 crc kubenswrapper[4907]: Trace[916872589]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10004ms (19:28:49.100) Oct 09 19:28:49 crc kubenswrapper[4907]: Trace[916872589]: [10.004495089s] [10.004495089s] END Oct 09 19:28:49 crc kubenswrapper[4907]: E1009 19:28:49.100870 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 19:28:49 crc kubenswrapper[4907]: E1009 19:28:49.242055 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.186ce9573015aa9d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-09 19:28:35.073174173 +0000 UTC m=+0.605141702,LastTimestamp:2025-10-09 19:28:35.073174173 +0000 UTC m=+0.605141702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 09 19:28:49 crc kubenswrapper[4907]: W1009 19:28:49.309824 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 19:28:49 crc kubenswrapper[4907]: I1009 19:28:49.310011 4907 trace.go:236] Trace[1759831780]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 19:28:39.306) (total time: 10003ms): Oct 09 19:28:49 crc kubenswrapper[4907]: Trace[1759831780]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10003ms (19:28:49.309) Oct 09 19:28:49 crc kubenswrapper[4907]: Trace[1759831780]: [10.003526225s] [10.003526225s] END Oct 09 19:28:49 crc kubenswrapper[4907]: E1009 19:28:49.310049 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 19:28:49 crc kubenswrapper[4907]: I1009 19:28:49.472822 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 19:28:49 crc kubenswrapper[4907]: I1009 19:28:49.473111 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 19:28:49 crc kubenswrapper[4907]: I1009 19:28:49.487398 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 19:28:49 crc kubenswrapper[4907]: I1009 19:28:49.487507 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 19:28:51 crc kubenswrapper[4907]: I1009 19:28:51.582994 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:51 crc kubenswrapper[4907]: I1009 19:28:51.584365 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:51 crc kubenswrapper[4907]: I1009 19:28:51.586768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:51 crc kubenswrapper[4907]: I1009 19:28:51.586831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:51 crc kubenswrapper[4907]: I1009 19:28:51.586845 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:51 crc kubenswrapper[4907]: I1009 19:28:51.589205 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:52 crc kubenswrapper[4907]: I1009 19:28:52.110812 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 19:28:52 crc kubenswrapper[4907]: I1009 19:28:52.110896 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 19:28:52 crc kubenswrapper[4907]: I1009 19:28:52.277372 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:52 crc kubenswrapper[4907]: I1009 19:28:52.278156 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:52 crc kubenswrapper[4907]: I1009 19:28:52.278197 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:52 crc kubenswrapper[4907]: I1009 19:28:52.278206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.017334 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.017569 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.018939 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.018984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.018999 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.022727 4907 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.031722 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.279130 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.279978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.280029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.280042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.324115 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.324304 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.325614 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.325664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:28:53 crc kubenswrapper[4907]: I1009 19:28:53.325674 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 09 19:28:54 crc kubenswrapper[4907]: E1009 19:28:54.469679 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 09 19:28:54 crc kubenswrapper[4907]: E1009 19:28:54.473066 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.475159 4907 trace.go:236] Trace[1555699244]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 19:28:43.768) (total time: 10706ms): Oct 09 19:28:54 crc kubenswrapper[4907]: Trace[1555699244]: ---"Objects listed" error: 10706ms (19:28:54.475) Oct 09 19:28:54 crc kubenswrapper[4907]: Trace[1555699244]: [10.706975407s] [10.706975407s] END Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.475187 4907 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.475199 4907 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.527783 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44654->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.527868 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44654->192.168.126.11:17697: read: connection reset by peer" Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.528301 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.528332 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.528565 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.528591 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 09 19:28:54 crc kubenswrapper[4907]: I1009 19:28:54.663275 4907 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.076009 4907 apiserver.go:52] "Watching apiserver" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.083013 4907 reflector.go:368] 
Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.083519 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-multus/multus-hns2h","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-2n5kb","openshift-machine-config-operator/machine-config-daemon-v2wbt","openshift-multus/multus-additional-cni-plugins-z8tzv","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-node-t8m7t"] Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.084168 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.084309 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.084437 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.084534 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.084920 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.084975 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.084998 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.085042 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.085199 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.085241 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.085304 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.085581 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.085793 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.086030 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.086881 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.086890 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.087019 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.088567 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.089071 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.089089 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.089874 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.090035 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.091567 
4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.091849 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.091789 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.092006 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.092534 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.093525 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.092557 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.095568 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.095888 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.096136 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.096254 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 
19:28:55.096583 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.096675 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.095585 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.097263 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.097753 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.098219 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.102167 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.102663 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.102765 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.103079 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.103302 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 
19:28:55.103621 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.125271 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.139384 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.151860 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.165655 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.180086 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.184507 4907 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.192570 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.202019 4907 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.204510 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.224549 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.236130 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.263485 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278690 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278751 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278778 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.278804 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278832 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278856 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278874 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278893 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278912 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278933 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278953 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278977 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.278999 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279015 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279032 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279047 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279068 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279097 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279115 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279131 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279149 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279171 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279195 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279216 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279269 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279293 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279323 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279311 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279341 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279449 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279516 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279547 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279578 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279610 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279633 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279656 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279685 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279709 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279732 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279738 4907 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279761 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279790 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279817 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279840 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279868 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279893 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279912 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279933 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279954 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.279982 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 
19:28:55.280005 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280025 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280050 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280071 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280091 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280156 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280182 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280208 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280235 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280258 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280283 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280310 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280336 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280360 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280384 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280414 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280438 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280480 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280506 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280531 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280554 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280605 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280635 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280660 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280685 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280708 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280756 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280777 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280800 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280823 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280848 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280873 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280976 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.281004 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281030 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281056 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281082 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281108 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281134 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281161 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281186 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281214 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281242 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281272 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.281298 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281323 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281350 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281379 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281402 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281434 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281483 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281515 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281541 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281592 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.281617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281642 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281667 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281692 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281720 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281745 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281770 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281794 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281818 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281843 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281867 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.281892 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281919 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281943 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281965 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281985 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282011 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282034 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282059 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282087 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282113 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282138 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282162 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282186 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282213 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282237 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282263 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282286 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282313 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282340 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282364 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282387 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282425 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 19:28:55 
crc kubenswrapper[4907]: I1009 19:28:55.282449 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282608 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282643 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282674 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282701 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282731 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282758 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282787 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282907 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282943 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282969 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") 
" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283026 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283057 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283083 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283111 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283144 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283172 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283194 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283223 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283251 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283275 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283297 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.283331 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283357 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283384 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283409 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283436 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283485 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283515 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283543 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283574 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283604 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283634 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 
19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283663 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283690 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283766 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283798 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283826 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283852 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283878 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283905 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283931 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283957 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283985 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284014 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284043 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284068 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284089 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284108 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284128 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284167 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284188 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284210 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284280 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284300 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284318 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284339 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.295275 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.299633 4907 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302901 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078" exitCode=255 Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302949 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078"} Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.316406 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280164 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280210 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280564 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280796 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.280898 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281018 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281147 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.319159 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.319276 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281192 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281326 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281425 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281544 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281712 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282291 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282669 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282766 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282926 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.282991 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283079 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283199 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283388 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283417 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283518 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283542 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283701 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.283808 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284033 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284094 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284139 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284208 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.284484 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:28:55.7844392 +0000 UTC m=+21.316406829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.319904 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/232fe335-3cd6-4fb1-b335-07fbfe64c940-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.319922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.320100 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrt2n\" (UniqueName: \"kubernetes.io/projected/232fe335-3cd6-4fb1-b335-07fbfe64c940-kube-api-access-zrt2n\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.320141 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/717141fe-c68d-4844-ad99-872d296a6370-rootfs\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.281164 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284563 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284566 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284601 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284887 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284926 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.284942 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.285167 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.285549 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.320385 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.285587 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.286181 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.286499 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.287724 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.319514 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.287808 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.287982 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.288054 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.288701 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.289056 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.289295 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.289505 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.289687 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.289708 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.289875 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.290146 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.290301 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.290319 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.290478 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.290404 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.290817 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.291173 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.291431 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.292042 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.292237 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.292446 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.292685 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.292743 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.292846 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.292895 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.293153 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.293181 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.296939 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.297057 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.297196 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.297361 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.297369 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.297543 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.297646 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.297691 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.298560 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.298860 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.300446 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.301175 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.301546 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.301640 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.301751 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.301866 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302011 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302092 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302222 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.320754 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48bed29d-cec4-4051-98da-e4a5547f1827-hosts-file\") pod \"node-resolver-2n5kb\" (UID: \"48bed29d-cec4-4051-98da-e4a5547f1827\") " pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302277 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302312 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302438 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302457 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.302871 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.303045 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.303773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.303780 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.303845 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.304177 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.304452 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.304786 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.305002 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.305137 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.305199 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.305507 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.306166 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.306535 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.307094 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.307356 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.307574 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.307809 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.307922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.308220 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.308526 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.309147 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.304840 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.309508 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.309531 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.309544 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.309593 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.309854 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.312215 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.312543 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.312673 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.313076 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.313428 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.313534 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.314676 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.314778 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.315132 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.320989 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-kubelet\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.315222 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.315788 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.316083 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.316354 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.316596 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.316437 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.317715 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.317910 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.317874 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318109 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318089 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318138 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318170 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318216 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318422 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318565 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318696 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.318974 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.319022 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.284507 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.320546 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.320821 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.321149 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.321815 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:55.82178471 +0000 UTC m=+21.353752219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.321824 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.321865 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-var-lib-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.321874 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.321900 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-config\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322045 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovn-node-metrics-cert\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322092 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322204 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-script-lib\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322342 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-cnibin\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322478 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322604 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-cnibin\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322627 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg9zk\" (UniqueName: 
\"kubernetes.io/projected/48bed29d-cec4-4051-98da-e4a5547f1827-kube-api-access-hg9zk\") pod \"node-resolver-2n5kb\" (UID: \"48bed29d-cec4-4051-98da-e4a5547f1827\") " pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322646 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-conf-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322768 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-multus-certs\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322846 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vd7\" (UniqueName: \"kubernetes.io/projected/717141fe-c68d-4844-ad99-872d296a6370-kube-api-access-l5vd7\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322914 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.322934 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-node-log\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.323020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-bin\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.323836 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324057 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-system-cni-dir\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324288 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-cni-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324373 4907 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-etc-kubernetes\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324755 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-slash\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324821 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-cni-bin\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324854 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324888 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-ovn-kubernetes\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324917 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/717141fe-c68d-4844-ad99-872d296a6370-mcd-auth-proxy-config\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324971 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64344fcc-f9f2-424f-a32b-44927641b614-multus-daemon-config\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.324990 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325010 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-systemd\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-log-socket\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325071 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325094 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 
19:28:55.325132 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-ovn\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325160 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-netd\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325204 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-env-overrides\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325225 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325253 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-os-release\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " 
pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325315 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-cni-multus\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717141fe-c68d-4844-ad99-872d296a6370-proxy-tls\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325397 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-systemd-units\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325413 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325450 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/232fe335-3cd6-4fb1-b335-07fbfe64c940-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325490 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64344fcc-f9f2-424f-a32b-44927641b614-cni-binary-copy\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325507 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-hostroot\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325524 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxdh2\" (UniqueName: \"kubernetes.io/projected/64344fcc-f9f2-424f-a32b-44927641b614-kube-api-access-kxdh2\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325572 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-system-cni-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325596 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325590 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325621 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325826 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-k8s-cni-cncf-io\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.325898 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325915 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.325967 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.325985 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:55.825960123 +0000 UTC m=+21.357927602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326032 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8n28\" (UniqueName: \"kubernetes.io/projected/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-kube-api-access-p8n28\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326286 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-os-release\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326328 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-kubelet\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.326372 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326375 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-netns\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326441 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-etc-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326507 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-socket-dir-parent\") pod \"multus-hns2h\" (UID: 
\"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326553 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-netns\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326753 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326758 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326908 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327048 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327112 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" 
DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327176 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327645 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327725 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327781 4907 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327849 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327915 4907 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327978 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: 
I1009 19:28:55.328033 4907 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.328619 4907 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.328709 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.328773 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.328837 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.328894 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.328996 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329448 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329560 4907 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329624 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329686 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329743 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329801 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329858 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329910 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.329973 4907 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.330035 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.330092 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.330149 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.330202 4907 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.330255 4907 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331223 4907 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331307 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331369 4907 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331423 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331503 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331568 4907 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331629 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331686 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331742 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331900 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.331959 4907 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332018 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332075 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332131 4907 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332184 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332234 4907 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332285 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332341 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332393 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332449 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332534 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332589 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332651 4907 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332719 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332778 4907 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332839 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.332923 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333017 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333091 4907 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333159 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333219 4907 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333279 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333340 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333403 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333486 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.326838 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333556 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333630 4907 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333648 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333666 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333680 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.327137 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333692 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" 
(UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333792 4907 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333817 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333829 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333840 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333851 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333865 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333878 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 
09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333888 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333899 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333909 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333921 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333932 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333942 4907 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333953 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333964 4907 reconciler_common.go:293] "Volume detached 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333974 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333983 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.333992 4907 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334004 4907 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334017 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334028 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334039 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" 
Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334049 4907 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334059 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334069 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334082 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334095 4907 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334106 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334118 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334128 4907 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334138 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334148 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334157 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334166 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334177 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334186 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334197 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334207 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334216 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334225 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334235 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334244 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334254 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334264 4907 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334274 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334335 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334384 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334398 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334410 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334421 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334431 4907 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334442 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334451 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334461 4907 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334484 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334496 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334507 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334517 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: 
I1009 19:28:55.334527 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334537 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334546 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334557 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334569 4907 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334581 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334591 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334602 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334613 4907 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334622 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334635 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334647 4907 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334658 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334670 4907 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334679 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334689 4907 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334701 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334712 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334723 4907 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334735 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334744 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334756 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334765 4907 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334774 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334783 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334794 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334804 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334814 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334835 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334844 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334854 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334863 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.334394 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.335906 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.336429 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.336729 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.337588 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.338180 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.338430 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.338903 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.338932 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.338947 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.339021 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:55.838991998 +0000 UTC m=+21.370959697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.339732 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.340224 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.340529 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.340621 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.340638 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.340652 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.340708 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:55.84068663 +0000 UTC m=+21.372654119 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.341830 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.343274 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.343840 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.345047 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.345236 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.346981 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.348763 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.350119 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.355599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.356195 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.360125 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.360391 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.360582 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.360665 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.361169 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.361454 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.363329 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.363339 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.363481 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.363516 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.363528 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.364075 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.364396 4907 scope.go:117] "RemoveContainer" containerID="63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.364614 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.363668 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.366364 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.366756 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.377227 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.387549 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.394715 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.395621 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.402026 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.414710 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.422457 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.433992 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435327 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-etc-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435450 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-os-release\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435545 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-etc-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435597 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-os-release\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " 
pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435562 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-kubelet\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435673 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-netns\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435696 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-socket-dir-parent\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435714 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-netns\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435738 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/717141fe-c68d-4844-ad99-872d296a6370-rootfs\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 
19:28:55.435771 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/232fe335-3cd6-4fb1-b335-07fbfe64c940-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435792 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrt2n\" (UniqueName: \"kubernetes.io/projected/232fe335-3cd6-4fb1-b335-07fbfe64c940-kube-api-access-zrt2n\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-socket-dir-parent\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435813 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovn-node-metrics-cert\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435832 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-script-lib\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 
19:28:55.435845 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-netns\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-cnibin\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435872 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48bed29d-cec4-4051-98da-e4a5547f1827-hosts-file\") pod \"node-resolver-2n5kb\" (UID: \"48bed29d-cec4-4051-98da-e4a5547f1827\") " pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-kubelet\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435913 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-var-lib-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435931 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-config\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435953 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-cnibin\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435968 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-conf-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.435984 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-multus-certs\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436004 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg9zk\" (UniqueName: \"kubernetes.io/projected/48bed29d-cec4-4051-98da-e4a5547f1827-kube-api-access-hg9zk\") pod \"node-resolver-2n5kb\" (UID: \"48bed29d-cec4-4051-98da-e4a5547f1827\") " pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436020 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-bin\") pod 
\"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436041 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-system-cni-dir\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436064 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-cni-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436082 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-etc-kubernetes\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vd7\" (UniqueName: \"kubernetes.io/projected/717141fe-c68d-4844-ad99-872d296a6370-kube-api-access-l5vd7\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436120 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-node-log\") pod \"ovnkube-node-t8m7t\" (UID: 
\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436141 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-cni-bin\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436159 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-slash\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436177 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436195 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/717141fe-c68d-4844-ad99-872d296a6370-mcd-auth-proxy-config\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436213 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-ovn-kubernetes\") pod \"ovnkube-node-t8m7t\" (UID: 
\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436233 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64344fcc-f9f2-424f-a32b-44927641b614-multus-daemon-config\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436269 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-systemd\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-log-socket\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-os-release\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " 
pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436337 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-ovn\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436353 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-netd\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436383 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-env-overrides\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437399 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/232fe335-3cd6-4fb1-b335-07fbfe64c940-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " 
pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64344fcc-f9f2-424f-a32b-44927641b614-cni-binary-copy\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437537 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-cni-multus\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437569 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717141fe-c68d-4844-ad99-872d296a6370-proxy-tls\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437589 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-systemd-units\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437621 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437652 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-system-cni-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437681 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-hostroot\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437708 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdh2\" (UniqueName: \"kubernetes.io/projected/64344fcc-f9f2-424f-a32b-44927641b614-kube-api-access-kxdh2\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437774 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-k8s-cni-cncf-io\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 
19:28:55.437809 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8n28\" (UniqueName: \"kubernetes.io/projected/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-kube-api-access-p8n28\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437895 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437911 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437894 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/232fe335-3cd6-4fb1-b335-07fbfe64c940-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.437929 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438073 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438105 4907 reconciler_common.go:293] "Volume detached for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438152 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438171 4907 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438196 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438213 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438228 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438244 4907 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438266 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 
19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438281 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438296 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438310 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438332 4907 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438349 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438363 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438383 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438395 4907 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438409 4907 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438423 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438443 4907 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438462 4907 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438502 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438517 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438538 4907 reconciler_common.go:293] "Volume detached for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438552 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438563 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-node-log\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438656 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-cni-bin\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438692 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-slash\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438722 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438747 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438777 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.438795 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.439096 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-kubelet\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.439272 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-netns\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.439336 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/717141fe-c68d-4844-ad99-872d296a6370-rootfs\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.436559 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-etc-kubernetes\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.439428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-ovn-kubernetes\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440259 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64344fcc-f9f2-424f-a32b-44927641b614-multus-daemon-config\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440350 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440397 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-systemd\") pod 
\"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440437 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-log-socket\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440704 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-netd\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440787 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-os-release\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440832 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440879 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-ovn\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" 
Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.440895 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.441853 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/717141fe-c68d-4844-ad99-872d296a6370-mcd-auth-proxy-config\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.441813 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-env-overrides\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.441968 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-system-cni-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.441987 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-hostroot\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442045 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-cnibin\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442188 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-k8s-cni-cncf-io\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/232fe335-3cd6-4fb1-b335-07fbfe64c940-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442429 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-config\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442492 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-var-lib-cni-multus\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442808 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-kubelet\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442590 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-cnibin\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442640 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-conf-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442844 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48bed29d-cec4-4051-98da-e4a5547f1827-hosts-file\") pod \"node-resolver-2n5kb\" (UID: \"48bed29d-cec4-4051-98da-e4a5547f1827\") " pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442686 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442802 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-systemd-units\") pod \"ovnkube-node-t8m7t\" (UID: 
\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442718 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-host-run-multus-certs\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442902 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-var-lib-openvswitch\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442927 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/232fe335-3cd6-4fb1-b335-07fbfe64c940-system-cni-dir\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442957 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-bin\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.442998 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64344fcc-f9f2-424f-a32b-44927641b614-multus-cni-dir\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc 
kubenswrapper[4907]: I1009 19:28:55.443348 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64344fcc-f9f2-424f-a32b-44927641b614-cni-binary-copy\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.443640 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-script-lib\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.456918 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717141fe-c68d-4844-ad99-872d296a6370-proxy-tls\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.458036 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8n28\" (UniqueName: \"kubernetes.io/projected/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-kube-api-access-p8n28\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.461390 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vd7\" (UniqueName: \"kubernetes.io/projected/717141fe-c68d-4844-ad99-872d296a6370-kube-api-access-l5vd7\") pod \"machine-config-daemon-v2wbt\" (UID: \"717141fe-c68d-4844-ad99-872d296a6370\") " pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.463837 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovn-node-metrics-cert\") pod \"ovnkube-node-t8m7t\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.464612 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.465542 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdh2\" (UniqueName: \"kubernetes.io/projected/64344fcc-f9f2-424f-a32b-44927641b614-kube-api-access-kxdh2\") pod \"multus-hns2h\" (UID: \"64344fcc-f9f2-424f-a32b-44927641b614\") " pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.471498 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.473097 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrt2n\" (UniqueName: \"kubernetes.io/projected/232fe335-3cd6-4fb1-b335-07fbfe64c940-kube-api-access-zrt2n\") pod \"multus-additional-cni-plugins-z8tzv\" (UID: \"232fe335-3cd6-4fb1-b335-07fbfe64c940\") " pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.475223 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg9zk\" (UniqueName: \"kubernetes.io/projected/48bed29d-cec4-4051-98da-e4a5547f1827-kube-api-access-hg9zk\") pod \"node-resolver-2n5kb\" (UID: \"48bed29d-cec4-4051-98da-e4a5547f1827\") " pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.479156 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.479712 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.488992 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.495670 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: W1009 19:28:55.497632 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod717141fe_c68d_4844_ad99_872d296a6370.slice/crio-8a57c44b93c5ca1e5168751d9e4b64886f9b8bc54acb0ecda5bc12d393629892 WatchSource:0}: Error finding container 8a57c44b93c5ca1e5168751d9e4b64886f9b8bc54acb0ecda5bc12d393629892: Status 404 returned error can't find the container with id 8a57c44b93c5ca1e5168751d9e4b64886f9b8bc54acb0ecda5bc12d393629892 Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.514926 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.525333 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.535673 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.545418 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.710630 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.721144 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:28:55 crc kubenswrapper[4907]: W1009 19:28:55.725706 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-016aba9ba3a6dff64bb7e6c4493b08562fb38152738e5b7d7dba5d49570f3dbb WatchSource:0}: Error finding container 016aba9ba3a6dff64bb7e6c4493b08562fb38152738e5b7d7dba5d49570f3dbb: Status 404 returned error can't find the container with id 016aba9ba3a6dff64bb7e6c4493b08562fb38152738e5b7d7dba5d49570f3dbb Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.735205 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.744475 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hns2h" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.756087 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2n5kb" Oct 09 19:28:55 crc kubenswrapper[4907]: W1009 19:28:55.761739 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8b5f22d3246d516b51b157cb9f357a5a9c92b83e6822c9489f3f6151c6736577 WatchSource:0}: Error finding container 8b5f22d3246d516b51b157cb9f357a5a9c92b83e6822c9489f3f6151c6736577: Status 404 returned error can't find the container with id 8b5f22d3246d516b51b157cb9f357a5a9c92b83e6822c9489f3f6151c6736577 Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.842074 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.842185 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.842226 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.842251 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:55 crc kubenswrapper[4907]: I1009 19:28:55.842268 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842427 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842452 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842484 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842549 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:56.842530785 +0000 UTC m=+22.374498274 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842612 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842671 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842731 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:56.842706749 +0000 UTC m=+22.374674238 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842756 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:56.84274565 +0000 UTC m=+22.374713139 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842619 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842778 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:28:56.842769841 +0000 UTC m=+22.374737330 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842790 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842808 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:55 crc kubenswrapper[4907]: E1009 19:28:55.842839 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:56.842833282 +0000 UTC m=+22.374800761 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.151453 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.151639 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.307585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.307637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.307647 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2b2b435cd42ab922fafa1ea48fb7ac450212ddbae16e065b15fdc74b3757d14f"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.309866 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.311563 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.312227 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.313826 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8b5f22d3246d516b51b157cb9f357a5a9c92b83e6822c9489f3f6151c6736577"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.315362 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.315451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"016aba9ba3a6dff64bb7e6c4493b08562fb38152738e5b7d7dba5d49570f3dbb"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.318350 4907 generic.go:334] "Generic (PLEG): container finished" podID="232fe335-3cd6-4fb1-b335-07fbfe64c940" containerID="0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616" exitCode=0 Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.318420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerDied","Data":"0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.318451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerStarted","Data":"96663517d432b641b9071a11be9fc1d5f201915bce13dbfa893d0a55929ed010"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.320137 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2n5kb" 
event={"ID":"48bed29d-cec4-4051-98da-e4a5547f1827","Type":"ContainerStarted","Data":"64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.320188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2n5kb" event={"ID":"48bed29d-cec4-4051-98da-e4a5547f1827","Type":"ContainerStarted","Data":"2ac1aec0a11b81251338b1240239d65e28d599446a960cc0931b3ba92353626a"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.323591 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.324005 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerStarted","Data":"4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.324046 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerStarted","Data":"bf51ddbc691681a3666430c1700da0ee586fe1b505dcaf5173257fdc4d93a681"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.326265 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9" exitCode=0 Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.326346 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" 
event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.326391 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"d762407dbdb1d5ce09cb35464d11a7924759dbf47c23a6e647dc36bdc9569405"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.328049 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.328075 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.328085 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"8a57c44b93c5ca1e5168751d9e4b64886f9b8bc54acb0ecda5bc12d393629892"} Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.338290 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.355996 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.367844 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.384634 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.411954 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 
19:28:56.435917 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.453950 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.468781 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.486957 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.513955 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.528004 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.544440 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.558227 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.576446 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.594808 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.628138 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.646719 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.661223 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.677939 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.701568 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.719443 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.733636 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.750348 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:56Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.854400 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.854616 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:28:58.854581022 +0000 UTC m=+24.386548521 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.855102 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.855151 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.855183 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:28:56 crc kubenswrapper[4907]: I1009 19:28:56.855224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855330 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855395 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:58.855379281 +0000 UTC m=+24.387346780 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855330 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855418 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855451 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855498 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855513 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855433 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855541 4907 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:58.855517515 +0000 UTC m=+24.387485004 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855552 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855574 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:58.855557206 +0000 UTC m=+24.387524905 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:56 crc kubenswrapper[4907]: E1009 19:28:56.855604 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:28:58.855590307 +0000 UTC m=+24.387557986 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.153592 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:57 crc kubenswrapper[4907]: E1009 19:28:57.153730 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.153797 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:28:57 crc kubenswrapper[4907]: E1009 19:28:57.153849 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.156313 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.157153 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.158043 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.181110 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.182134 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" 
path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.183128 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.217195 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.218173 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.219665 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.220261 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.221314 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.222190 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.222881 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.224085 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.224733 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.225728 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.226341 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.226817 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.227806 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.228399 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.229286 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.229901 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.230374 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.231811 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.232305 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.233537 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.234276 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.235219 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.235827 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.236729 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.237189 4907 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.237290 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.239796 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.240545 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.240970 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.242664 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.244020 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.244589 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.245580 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.246201 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.247031 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.248272 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.249594 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.250379 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.251397 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.252101 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.253229 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.254566 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.255482 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.256086 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.257537 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.258386 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.259531 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.260096 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.333992 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerStarted","Data":"a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584"} Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.351121 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" 
Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.367091 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.381371 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.400412 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.415644 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.440634 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.452243 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.465974 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.482032 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.509550 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.528069 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.575800 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.614834 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.636968 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.654846 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.666801 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.683519 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.705759 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.724677 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.741861 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.757517 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.773426 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.786731 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:57 crc kubenswrapper[4907]: I1009 19:28:57.803190 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:57Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.151476 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.151621 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.339736 4907 generic.go:334] "Generic (PLEG): container finished" podID="232fe335-3cd6-4fb1-b335-07fbfe64c940" containerID="a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584" exitCode=0 Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.339851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerDied","Data":"a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584"} Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.343989 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.344039 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.344057 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.344069 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} Oct 09 19:28:58 crc 
kubenswrapper[4907]: I1009 19:28:58.344084 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.344098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.356573 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.369859 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.386267 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.396644 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.413265 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.426812 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.439664 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.453705 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.472013 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.483440 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.500318 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.515370 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:58Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.876077 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.876392 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:02.876342859 +0000 UTC m=+28.408310358 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.876687 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.876716 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.876894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:58 crc kubenswrapper[4907]: I1009 19:28:58.876925 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.876854 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877019 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877034 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877040 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:02.877027996 +0000 UTC m=+28.408995495 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877047 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.876940 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877081 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877098 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877106 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:02.877090098 +0000 UTC m=+28.409057587 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877132 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:02.877123859 +0000 UTC m=+28.409091358 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877150 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:58 crc kubenswrapper[4907]: E1009 19:28:58.877181 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:02.87717281 +0000 UTC m=+28.409140299 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.119454 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.130139 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.137583 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.141835 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.151409 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:28:59 crc kubenswrapper[4907]: E1009 19:28:59.151630 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.151759 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:28:59 crc kubenswrapper[4907]: E1009 19:28:59.151848 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.159958 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.179442 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.192013 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.206044 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.219729 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.237561 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.253920 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.273550 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.289744 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.301041 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.314658 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.332863 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.346181 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.348777 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf"} Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.351273 4907 generic.go:334] "Generic (PLEG): container finished" podID="232fe335-3cd6-4fb1-b335-07fbfe64c940" containerID="17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03" exitCode=0 Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.351341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerDied","Data":"17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03"} Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.363689 4907 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.376642 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.386373 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.398346 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.410414 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.426051 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.447639 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.464397 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.479290 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.492254 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.503120 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.516308 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.529049 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.541310 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.560921 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.572173 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.583634 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.602577 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.616181 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.627134 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.639029 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.648927 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.658756 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:28:59 crc kubenswrapper[4907]: I1009 19:28:59.670950 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:28:59Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 
19:29:00.151211 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:00 crc kubenswrapper[4907]: E1009 19:29:00.151342 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.359029 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.361887 4907 generic.go:334] "Generic (PLEG): container finished" podID="232fe335-3cd6-4fb1-b335-07fbfe64c940" containerID="6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d" exitCode=0 Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.362003 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerDied","Data":"6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d"} Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.380608 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.396350 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.412979 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.428033 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.457226 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.472315 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.520150 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.562958 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.580548 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.592542 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.607240 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.621044 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.636747 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.874061 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.875966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.876007 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.876019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.876145 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.883150 4907 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.883550 4907 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.887888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.887951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.887966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 
19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.887987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.888005 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:00Z","lastTransitionTime":"2025-10-09T19:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:00 crc kubenswrapper[4907]: E1009 19:29:00.903720 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbf
ff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3e
e8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\
"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.908497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.908533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.908546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.908561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.908571 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:00Z","lastTransitionTime":"2025-10-09T19:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:00 crc kubenswrapper[4907]: E1009 19:29:00.922331 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.927511 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.927539 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.927553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.927570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.927581 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:00Z","lastTransitionTime":"2025-10-09T19:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:00 crc kubenswrapper[4907]: E1009 19:29:00.940077 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [payload identical to previous attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.943708 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.943732 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.943742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.943760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.943772 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:00Z","lastTransitionTime":"2025-10-09T19:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:00 crc kubenswrapper[4907]: E1009 19:29:00.960184 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [payload identical to previous attempt] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.967789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.967831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.967842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.967861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.967875 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:00Z","lastTransitionTime":"2025-10-09T19:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.974331 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dslfr"]
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.974746 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.977036 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.977158 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.977220 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.977429 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 09 19:29:00 crc kubenswrapper[4907]: E1009 19:29:00.982618 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z"
Oct 09 19:29:00 crc kubenswrapper[4907]: E1009 19:29:00.982725 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.987408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.987444 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.987453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.987500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.987511 4907 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:00Z","lastTransitionTime":"2025-10-09T19:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:00 crc kubenswrapper[4907]: I1009 19:29:00.989620 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:00Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.002375 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.015636 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.027335 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.037085 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.052652 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.064119 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.076569 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.088872 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.090516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.090550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.090560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.090577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.090587 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.099881 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5c2j\" (UniqueName: \"kubernetes.io/projected/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-kube-api-access-d5c2j\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.099940 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-host\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.099993 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-serviceca\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.107935 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.129680 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.145877 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.151132 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:01 crc kubenswrapper[4907]: E1009 19:29:01.151265 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.151339 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:01 crc kubenswrapper[4907]: E1009 19:29:01.151389 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.156553 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.168577 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.193495 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.193536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.193548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.193569 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.193579 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.201699 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c2j\" (UniqueName: \"kubernetes.io/projected/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-kube-api-access-d5c2j\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.201785 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-host\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.201861 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-serviceca\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.202479 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-host\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.203665 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-serviceca\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.232907 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c2j\" (UniqueName: \"kubernetes.io/projected/4eaeec14-bcbe-4871-b6c2-7ebd234c04bc-kube-api-access-d5c2j\") pod \"node-ca-dslfr\" (UID: \"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\") " pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.288817 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dslfr" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.295583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.295602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.295611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.295625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.295633 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: W1009 19:29:01.364355 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eaeec14_bcbe_4871_b6c2_7ebd234c04bc.slice/crio-97f3f7f804fd90ccc7f59d164f7a09c7b05159e3497e826fd1d717a63af8b410 WatchSource:0}: Error finding container 97f3f7f804fd90ccc7f59d164f7a09c7b05159e3497e826fd1d717a63af8b410: Status 404 returned error can't find the container with id 97f3f7f804fd90ccc7f59d164f7a09c7b05159e3497e826fd1d717a63af8b410 Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.373938 4907 generic.go:334] "Generic (PLEG): container finished" podID="232fe335-3cd6-4fb1-b335-07fbfe64c940" containerID="d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238" exitCode=0 Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.373988 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerDied","Data":"d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.388779 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.403689 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.403723 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.403733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.403751 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.403761 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.405742 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c
265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.424771 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.440492 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.453653 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.471596 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.489042 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.506633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.506670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.506682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.506702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.506713 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.509809 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.529230 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.545640 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.562020 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.579283 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.592979 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.609614 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.609656 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.609664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.609680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.609691 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.612655 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:01Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.712169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.712208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.712220 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.712238 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.712251 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.814764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.814810 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.814822 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.814841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.814854 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.918218 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.918263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.918276 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.918295 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:01 crc kubenswrapper[4907]: I1009 19:29:01.918308 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:01Z","lastTransitionTime":"2025-10-09T19:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.021819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.021872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.021884 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.021905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.021917 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.124593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.124636 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.124646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.124667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.124678 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.151573 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.151874 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.228360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.228415 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.228428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.228452 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.228481 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.331527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.331615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.331628 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.331648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.331690 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.395118 4907 generic.go:334] "Generic (PLEG): container finished" podID="232fe335-3cd6-4fb1-b335-07fbfe64c940" containerID="291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318" exitCode=0 Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.395200 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerDied","Data":"291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.397071 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dslfr" event={"ID":"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc","Type":"ContainerStarted","Data":"ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.397131 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dslfr" event={"ID":"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc","Type":"ContainerStarted","Data":"97f3f7f804fd90ccc7f59d164f7a09c7b05159e3497e826fd1d717a63af8b410"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.410449 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.424703 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.434925 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.434956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.434974 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.434996 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.435008 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.444945 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.478962 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.497101 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.512293 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.533965 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.539649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.539691 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.539705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.539725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.539738 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.548874 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.567076 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.584542 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.601530 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.616845 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.634361 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.642777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.642829 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.642840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.642860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.642873 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.648670 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.659861 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.672742 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.691256 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.701389 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.712661 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.725302 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.739593 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.746680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.746738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.746752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.746774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.746788 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.753118 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.767961 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.785246 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.806447 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad980
4c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.823785 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.842049 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.849424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.849481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.849491 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.849508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.849520 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.859690 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.920992 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.921125 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.921165 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.921192 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921243 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:10.921203747 +0000 UTC m=+36.453171236 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921339 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921356 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921382 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921406 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921410 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:10.921387631 +0000 UTC m=+36.453355220 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921446 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:10.921430922 +0000 UTC m=+36.453398511 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921599 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921619 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921636 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.921695 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:10.921682498 +0000 UTC m=+36.453649987 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.921923 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.922016 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: E1009 19:29:02.922055 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:10.922046157 +0000 UTC m=+36.454013756 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.953427 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.953482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.953494 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.953514 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:02 crc kubenswrapper[4907]: I1009 19:29:02.953526 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:02Z","lastTransitionTime":"2025-10-09T19:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.057725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.057775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.057789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.057813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.057825 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.151114 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.151172 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:03 crc kubenswrapper[4907]: E1009 19:29:03.151272 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:03 crc kubenswrapper[4907]: E1009 19:29:03.151433 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.159733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.159777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.159785 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.159800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.159811 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.262744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.262774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.262784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.262800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.262810 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.365978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.366051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.366064 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.366083 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.366096 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.405429 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.405688 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.410376 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" event={"ID":"232fe335-3cd6-4fb1-b335-07fbfe64c940","Type":"ContainerStarted","Data":"5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.420790 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.435759 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.435763 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.448124 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.460928 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.468497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc 
kubenswrapper[4907]: I1009 19:29:03.468538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.468547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.468564 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.468575 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.475032 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.489107 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.507626 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.522344 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.535864 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.551201 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.564941 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.570691 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.570810 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.571187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.571287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.571381 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.575657 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.588585 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.602877 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.618437 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.632113 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.644283 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.657513 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.673001 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.674320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.674394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.674405 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.674426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.674438 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.685682 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z 
is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.697776 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.710691 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.730519 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.742173 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.753617 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.763929 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.773116 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.776748 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.776798 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.776818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.776846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.776865 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.782296 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:03Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.880357 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.880399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.880410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.880429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.880441 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.983548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.983778 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.983921 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.984029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:03 crc kubenswrapper[4907]: I1009 19:29:03.984104 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:03Z","lastTransitionTime":"2025-10-09T19:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.087675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.087710 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.087719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.087738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.087747 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.150739 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:04 crc kubenswrapper[4907]: E1009 19:29:04.151212 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.190933 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.191004 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.191023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.191052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.191072 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.295077 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.295167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.295193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.295229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.295248 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.397992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.398071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.398094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.398131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.398156 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.413622 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.414088 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.488551 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.501128 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.501168 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.501180 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.501196 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.501206 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.501429 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5e
b7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.515612 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.537564 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.555364 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.581361 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.602087 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.602750 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.602771 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.602779 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.602794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.602803 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.620561 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.643597 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.662882 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.690387 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.705074 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.705127 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.705139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.705157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.705169 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.714088 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.740594 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.762847 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.782031 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:04Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.807658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.807696 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.807706 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.807721 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.807732 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.910018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.910063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.910073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.910090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:04 crc kubenswrapper[4907]: I1009 19:29:04.910101 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:04Z","lastTransitionTime":"2025-10-09T19:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.012909 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.012948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.012958 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.012976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.012988 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.115956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.116008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.116020 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.116040 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.116053 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.151349 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.151395 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:05 crc kubenswrapper[4907]: E1009 19:29:05.151677 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:05 crc kubenswrapper[4907]: E1009 19:29:05.151832 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.173778 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2
808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b
1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.186744 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.198287 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.208678 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.218683 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.218725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.218737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.218755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.218769 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.226844 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.237931 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.251123 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.264746 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.278911 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.290695 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.304701 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.321990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.322160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.322237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.322303 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.322364 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.322607 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.336969 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.353858 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.420955 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/0.log" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.424462 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db" exitCode=1 Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.424574 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" 
event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.425537 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.425652 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.425715 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.425783 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.425854 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.426112 4907 scope.go:117] "RemoveContainer" containerID="7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.449389 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.462172 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.476943 4907 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.504234 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:05Z\\\",\\\"message\\\":\\\"y (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 19:29:05.361060 6211 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 19:29:05.361400 6211 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.361739 6211 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.361888 6211 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.362005 6211 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.362094 6211 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.362548 6211 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 19:29:05.362619 6211 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:05.362664 6211 factory.go:656] Stopping watch factory\\\\nI1009 19:29:05.362683 6211 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:05.362711 6211 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.522123 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.533359 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.533421 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.533439 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.533482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.533496 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.548521 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.564606 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.579819 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.591455 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.600841 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.617362 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.629876 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.635516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.635563 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.635575 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.635590 
4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.635602 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.643648 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.658743 4907 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.738108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.738601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.738670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.738741 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.738809 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.841636 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.841681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.841697 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.841717 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.841729 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.943877 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.944112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.944178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.944252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:05 crc kubenswrapper[4907]: I1009 19:29:05.944316 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:05Z","lastTransitionTime":"2025-10-09T19:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.046713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.046761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.046775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.046796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.046810 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.149510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.149554 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.149565 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.149585 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.149596 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.151135 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:06 crc kubenswrapper[4907]: E1009 19:29:06.151311 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.251939 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.252167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.252231 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.252299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.252354 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.354999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.355050 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.355069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.355091 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.355102 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.429335 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/1.log" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.430100 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/0.log" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.433109 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f" exitCode=1 Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.433168 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.433242 4907 scope.go:117] "RemoveContainer" containerID="7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.434175 4907 scope.go:117] "RemoveContainer" containerID="51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f" Oct 09 19:29:06 crc kubenswrapper[4907]: E1009 19:29:06.436254 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.450547 4907 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10
-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a
077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.458385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.458417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.458425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.458441 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.458451 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.465479 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5e
b7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.485986 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.499401 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.516108 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.536773 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 
19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.562068 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.562100 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.562110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.562128 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.562138 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.565736 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e80de6a17a0f8a771dec9669ab14efddbd41f58dd03b6b871acea892d5559db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:05Z\\\",\\\"message\\\":\\\"y (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 19:29:05.361060 6211 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 19:29:05.361400 6211 reflector.go:311] Stopping 
reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.361739 6211 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.361888 6211 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.362005 6211 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.362094 6211 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1009 19:29:05.362548 6211 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 19:29:05.362619 6211 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:05.362664 6211 factory.go:656] Stopping watch factory\\\\nI1009 19:29:05.362683 6211 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:05.362711 6211 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"n
ame\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.582237 4907 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.597556 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.611759 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.626061 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.638983 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.651248 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.664626 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.664705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.664723 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.665153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.665177 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.667679 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:06Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.768369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.768408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.768418 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.768437 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.768450 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.871676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.871723 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.871734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.871752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.871769 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.974425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.974480 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.974490 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.974506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:06 crc kubenswrapper[4907]: I1009 19:29:06.974516 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:06Z","lastTransitionTime":"2025-10-09T19:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.077486 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.077541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.077552 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.077573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.077588 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.151534 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:07 crc kubenswrapper[4907]: E1009 19:29:07.151687 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.151992 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:07 crc kubenswrapper[4907]: E1009 19:29:07.152295 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.179866 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.179911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.179924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.179943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.179955 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.282055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.282120 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.282133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.282176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.282190 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.385730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.385786 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.385797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.385818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.385830 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.437990 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/1.log" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.452160 4907 scope.go:117] "RemoveContainer" containerID="51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f" Oct 09 19:29:07 crc kubenswrapper[4907]: E1009 19:29:07.452550 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.466744 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.478161 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.488543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.488584 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.488597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.488619 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.488632 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.492069 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.509288 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.520879 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.533252 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.546977 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.565969 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.576821 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.588981 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.590955 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.590984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.590995 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.591014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.591025 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.607666 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.620124 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.630731 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.644702 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.688310 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r"] Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.689026 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.691114 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.691225 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.692831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.692871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.692903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.692926 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.692935 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.708224 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.721534 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.734752 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.745870 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.754063 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.769016 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2
bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.775680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cpf\" (UniqueName: \"kubernetes.io/projected/c98e5d7e-5d91-4825-a839-86a88cc66d4c-kube-api-access-g6cpf\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.775710 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c98e5d7e-5d91-4825-a839-86a88cc66d4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.775740 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c98e5d7e-5d91-4825-a839-86a88cc66d4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.775768 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c98e5d7e-5d91-4825-a839-86a88cc66d4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.783226 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.795491 4907 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.795527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.795540 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.795560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.795571 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.797248 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.810163 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.820329 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.833439 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.852000 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.863855 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.876830 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c98e5d7e-5d91-4825-a839-86a88cc66d4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.876929 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6cpf\" (UniqueName: \"kubernetes.io/projected/c98e5d7e-5d91-4825-a839-86a88cc66d4c-kube-api-access-g6cpf\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.876962 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c98e5d7e-5d91-4825-a839-86a88cc66d4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.876989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c98e5d7e-5d91-4825-a839-86a88cc66d4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.877748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c98e5d7e-5d91-4825-a839-86a88cc66d4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.877907 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c98e5d7e-5d91-4825-a839-86a88cc66d4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.879045 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.889035 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c98e5d7e-5d91-4825-a839-86a88cc66d4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.896391 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:07Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.897978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.898013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.898025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:07 crc 
kubenswrapper[4907]: I1009 19:29:07.898042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.898052 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:07Z","lastTransitionTime":"2025-10-09T19:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:07 crc kubenswrapper[4907]: I1009 19:29:07.908450 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cpf\" (UniqueName: \"kubernetes.io/projected/c98e5d7e-5d91-4825-a839-86a88cc66d4c-kube-api-access-g6cpf\") pod \"ovnkube-control-plane-749d76644c-ck44r\" (UID: \"c98e5d7e-5d91-4825-a839-86a88cc66d4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.000926 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.001010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.001021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.001038 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.001051 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.003144 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.103686 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.103719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.103736 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.103752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.103762 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.151550 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:08 crc kubenswrapper[4907]: E1009 19:29:08.151711 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.206298 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.206333 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.206342 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.206357 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.206369 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.308523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.308558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.308568 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.308585 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.308598 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.410975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.411022 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.411033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.411052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.411064 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.445612 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" event={"ID":"c98e5d7e-5d91-4825-a839-86a88cc66d4c","Type":"ContainerStarted","Data":"04160b25a296b42998046f3533cfb2b1197baa0de26895e3186c0dddf9769dff"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.445664 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" event={"ID":"c98e5d7e-5d91-4825-a839-86a88cc66d4c","Type":"ContainerStarted","Data":"1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.445674 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" event={"ID":"c98e5d7e-5d91-4825-a839-86a88cc66d4c","Type":"ContainerStarted","Data":"266350ed24f536db6d05ebdb27eb5f9f046967c4a858e19e8b76fbd96ecc5767"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.455689 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.467966 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.478554 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.488982 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.499304 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.512935 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.513374 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.513415 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.513425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.513445 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.513477 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.524266 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.536727 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.548500 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.560099 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.575763 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.586519 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.598774 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.608660 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.619007 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 
19:29:08.619065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.619076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.619097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.619444 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.623652 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.722149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.722198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.722211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.722230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.722244 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.793280 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sbjsv"] Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.793847 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:08 crc kubenswrapper[4907]: E1009 19:29:08.793913 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.806646 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.816154 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc 
kubenswrapper[4907]: I1009 19:29:08.824342 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.824384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.824400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.824419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.824430 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.829151 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.845805 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.855932 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.867492 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.879599 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.888295 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxw2k\" (UniqueName: \"kubernetes.io/projected/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-kube-api-access-cxw2k\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.888500 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.890310 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.899571 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.909149 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.920692 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.926921 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.926954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.926966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.926985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.926997 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:08Z","lastTransitionTime":"2025-10-09T19:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.935704 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.948582 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.962174 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.974849 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.985094 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:08Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:08 crc kubenswrapper[4907]: I1009 19:29:08.989571 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxw2k\" (UniqueName: \"kubernetes.io/projected/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-kube-api-access-cxw2k\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:08 crc kubenswrapper[4907]: 
I1009 19:29:08.989694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:08 crc kubenswrapper[4907]: E1009 19:29:08.989828 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:08 crc kubenswrapper[4907]: E1009 19:29:08.989913 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:09.489895361 +0000 UTC m=+35.021862860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.006238 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxw2k\" (UniqueName: \"kubernetes.io/projected/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-kube-api-access-cxw2k\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.029961 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.030018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc 
kubenswrapper[4907]: I1009 19:29:09.030028 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.030049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.030063 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.132498 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.132538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.132547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.132565 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.132576 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.150391 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:09 crc kubenswrapper[4907]: E1009 19:29:09.150529 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.150568 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:09 crc kubenswrapper[4907]: E1009 19:29:09.150614 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.235252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.235291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.235302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.235320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.235331 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.338520 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.338734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.338812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.338876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.338930 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.442013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.442052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.442066 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.442086 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.442100 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.497559 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:09 crc kubenswrapper[4907]: E1009 19:29:09.497716 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:09 crc kubenswrapper[4907]: E1009 19:29:09.497787 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:10.497767897 +0000 UTC m=+36.029735386 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.544629 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.544709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.544729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.544760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.544780 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.647692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.648192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.648272 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.648370 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.648433 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.751701 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.751984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.752055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.752119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.752185 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.854987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.855107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.855678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.855765 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.856020 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.961187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.961637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.961657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.961684 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:09 crc kubenswrapper[4907]: I1009 19:29:09.961704 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:09Z","lastTransitionTime":"2025-10-09T19:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.064454 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.064512 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.064527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.064543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.064553 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.150851 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.151010 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:10 crc kubenswrapper[4907]: E1009 19:29:10.151070 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:10 crc kubenswrapper[4907]: E1009 19:29:10.151338 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.168258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.168294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.168305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.168321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.168331 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.270620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.270669 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.270678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.270695 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.270705 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.373255 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.373550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.373637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.373720 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.373797 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.476986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.477020 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.477031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.477048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.477057 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.508871 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:10 crc kubenswrapper[4907]: E1009 19:29:10.509084 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:10 crc kubenswrapper[4907]: E1009 19:29:10.509166 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:12.509146497 +0000 UTC m=+38.041113986 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.580906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.580943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.580953 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.580972 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.580985 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.683996 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.684031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.684043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.684063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.684075 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.787185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.787226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.787237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.787255 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.787264 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.890205 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.890244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.890254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.890271 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.890281 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.993962 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.994018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.994035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.994061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:10 crc kubenswrapper[4907]: I1009 19:29:10.994079 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:10Z","lastTransitionTime":"2025-10-09T19:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.014917 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.015050 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.015121 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.015172 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.015217 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015370 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015400 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015435 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:27.015392912 +0000 UTC m=+52.547360451 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015514 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015546 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:27.015526815 +0000 UTC m=+52.547494354 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015450 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015585 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015602 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015634 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015655 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015610 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:27.015581726 +0000 UTC m=+52.547549255 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015834 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:27.015752001 +0000 UTC m=+52.547719490 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.015861 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:27.015852003 +0000 UTC m=+52.547819492 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.024920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.024968 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.024978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.025001 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.025012 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.046315 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:11Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.052131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.052162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.052172 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.052189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.052201 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.072400 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:11Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.076296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.076354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.076372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.076400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.076417 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.094760 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:11Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.098979 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.099042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.099060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.099083 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.099100 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.117189 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:11Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.123487 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.123730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.123741 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.123762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.123779 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.143016 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:11Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.143135 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.145504 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.145542 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.145554 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.145576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.145589 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.151042 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.151086 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.151272 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:11 crc kubenswrapper[4907]: E1009 19:29:11.151420 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.249022 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.249076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.249090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.249113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.249126 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.352292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.352329 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.352338 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.352356 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.352369 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.456376 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.456414 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.456424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.456440 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.456450 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.559820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.560132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.560260 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.560404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.560541 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.663367 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.663416 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.663428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.663448 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.663461 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.766705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.766773 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.766792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.766820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.766837 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.869226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.869263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.869276 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.869294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.869307 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.973320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.973399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.973423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.973451 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:11 crc kubenswrapper[4907]: I1009 19:29:11.973504 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:11Z","lastTransitionTime":"2025-10-09T19:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.077672 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.078137 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.078295 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.078561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.078731 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.150851 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.150855 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:12 crc kubenswrapper[4907]: E1009 19:29:12.151060 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:12 crc kubenswrapper[4907]: E1009 19:29:12.151183 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.182912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.182976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.182994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.183017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.183032 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.286349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.286422 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.286443 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.286501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.286523 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.389986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.390074 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.390099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.390139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.390169 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.493453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.493539 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.493557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.493580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.493595 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.533552 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:12 crc kubenswrapper[4907]: E1009 19:29:12.533766 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:12 crc kubenswrapper[4907]: E1009 19:29:12.533850 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:16.533828428 +0000 UTC m=+42.065795927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.596622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.596673 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.596682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.596700 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.596712 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.699347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.699398 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.699409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.699428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.699440 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.803282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.803349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.803368 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.803396 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.803419 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.907405 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.907485 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.907501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.907523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:12 crc kubenswrapper[4907]: I1009 19:29:12.907535 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:12Z","lastTransitionTime":"2025-10-09T19:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.011244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.011328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.011354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.011390 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.011416 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.114737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.114817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.114840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.114868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.114888 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.151786 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.151928 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:13 crc kubenswrapper[4907]: E1009 19:29:13.152280 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:13 crc kubenswrapper[4907]: E1009 19:29:13.151957 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.217544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.217604 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.217619 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.217646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.217664 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.320726 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.320787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.320800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.320818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.320832 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.423508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.423559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.423573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.423596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.423607 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.526685 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.526725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.526736 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.526755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.526770 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.630064 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.630173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.630217 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.630249 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.630268 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.733859 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.733944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.733974 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.734013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.734040 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.837809 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.837890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.837913 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.837944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.837968 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.941975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.942048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.942072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.942104 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:13 crc kubenswrapper[4907]: I1009 19:29:13.942128 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:13Z","lastTransitionTime":"2025-10-09T19:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.045690 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.045797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.045822 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.045857 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.045880 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.150041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.150516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.150817 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.150818 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.150858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: E1009 19:29:14.151047 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.151184 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.151223 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: E1009 19:29:14.151428 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.255263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.255317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.255328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.255348 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.255358 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.289746 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.310873 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.331969 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.353236 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.358566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.358853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.358941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.359056 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.359142 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.368110 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.385406 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.410248 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830
d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5
a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.430539 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.449280 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.462797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.462870 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.462889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.462920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.462940 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.468309 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.490323 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.505695 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c91
34253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.528297 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.543119 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.561608 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.566166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.566227 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.566250 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.566278 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.566300 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.580253 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.593908 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:14Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:14 crc 
kubenswrapper[4907]: I1009 19:29:14.669841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.669921 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.669955 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.669993 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.670019 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.772771 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.772832 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.772842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.772862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.772872 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.875586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.875689 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.875702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.875740 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.875754 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.978695 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.978739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.978751 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.978770 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:14 crc kubenswrapper[4907]: I1009 19:29:14.978783 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:14Z","lastTransitionTime":"2025-10-09T19:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.082963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.083061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.083085 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.083125 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.083162 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.151686 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:15 crc kubenswrapper[4907]: E1009 19:29:15.151951 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.157235 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:15 crc kubenswrapper[4907]: E1009 19:29:15.157422 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.179033 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.187247 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.187307 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.187324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.187349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.187362 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.201674 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5e
b7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.220291 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.233057 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.246640 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.264948 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c91
34253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.287813 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.290512 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.290556 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.290575 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.290601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.290617 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.303205 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.317615 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.333442 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.347438 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc 
kubenswrapper[4907]: I1009 19:29:15.360276 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.374051 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18
e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.393311 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.393490 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.393662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.393679 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.393700 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.393716 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.407191 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.421751 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.497174 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.497234 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.497252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.497275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.497295 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.601188 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.601545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.601661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.601749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.601807 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.705588 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.705717 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.705741 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.705789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.705809 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.808894 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.808948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.808965 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.808990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.809007 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.913527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.913601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.913612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.913630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:15 crc kubenswrapper[4907]: I1009 19:29:15.913642 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:15Z","lastTransitionTime":"2025-10-09T19:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.016893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.017410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.017637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.017844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.017978 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.121857 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.122309 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.122442 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.122647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.122798 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.151278 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:16 crc kubenswrapper[4907]: E1009 19:29:16.151518 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.151302 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:16 crc kubenswrapper[4907]: E1009 19:29:16.151670 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.226191 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.226675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.226908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.227060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.227196 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.331882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.331954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.331978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.332005 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.332026 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.435522 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.435595 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.435614 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.435641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.435658 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.539448 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.539552 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.539570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.539600 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.539621 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.581846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:16 crc kubenswrapper[4907]: E1009 19:29:16.582159 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:16 crc kubenswrapper[4907]: E1009 19:29:16.582261 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:24.582231524 +0000 UTC m=+50.114199043 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.643722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.643781 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.643800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.643827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.643851 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.747540 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.747616 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.747637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.747676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.747702 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.851149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.851236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.851261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.851304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.851327 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.955696 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.955762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.955778 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.955800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:16 crc kubenswrapper[4907]: I1009 19:29:16.955820 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:16Z","lastTransitionTime":"2025-10-09T19:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.059115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.059196 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.059221 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.059256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.059282 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.151717 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.151729 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:17 crc kubenswrapper[4907]: E1009 19:29:17.151965 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:17 crc kubenswrapper[4907]: E1009 19:29:17.152158 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.161682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.161750 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.161768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.161793 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.161813 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.264198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.264236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.264245 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.264261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.264273 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.367094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.367137 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.367150 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.367171 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.367188 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.469746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.469818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.469830 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.469848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.469859 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.572930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.572986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.573000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.573022 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.573035 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.676172 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.676225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.676236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.676259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.676274 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.778878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.778942 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.778959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.778985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.779013 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.886722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.886795 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.886816 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.886845 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.886870 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.990920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.991013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.991037 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.991073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:17 crc kubenswrapper[4907]: I1009 19:29:17.991103 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:17Z","lastTransitionTime":"2025-10-09T19:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.094616 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.094711 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.094724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.094747 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.094796 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.151512 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.151610 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:18 crc kubenswrapper[4907]: E1009 19:29:18.151720 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:18 crc kubenswrapper[4907]: E1009 19:29:18.151891 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.197972 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.198019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.198036 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.198062 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.198076 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.305617 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.305654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.305662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.305681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.305693 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.408254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.408305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.408317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.408336 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.408346 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.510514 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.510559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.510569 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.510585 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.510596 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.613058 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.613100 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.613112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.613132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.613143 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.715966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.716007 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.716018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.716035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.716046 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.818398 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.818449 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.818484 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.818508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.818526 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.920942 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.920982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.920992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.921029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:18 crc kubenswrapper[4907]: I1009 19:29:18.921040 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:18Z","lastTransitionTime":"2025-10-09T19:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.023387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.023428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.023438 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.023456 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.023480 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.126428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.126459 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.126494 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.126513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.126528 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.151104 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.151243 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:19 crc kubenswrapper[4907]: E1009 19:29:19.151444 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:19 crc kubenswrapper[4907]: E1009 19:29:19.151598 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.229493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.229538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.229550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.229574 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.229587 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.332919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.333161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.333268 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.333338 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.333394 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.436308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.436582 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.436669 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.436813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.436905 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.540854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.540906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.540924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.540946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.540963 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.643881 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.643918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.643926 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.643942 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.643954 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.747070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.747297 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.747361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.747483 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.747550 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.851173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.851459 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.851557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.851629 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.851709 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.958782 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.958831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.958842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.958858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:19 crc kubenswrapper[4907]: I1009 19:29:19.958869 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:19Z","lastTransitionTime":"2025-10-09T19:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.062900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.062957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.062968 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.062986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.062998 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.150896 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.150951 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:20 crc kubenswrapper[4907]: E1009 19:29:20.151041 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:20 crc kubenswrapper[4907]: E1009 19:29:20.151200 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.166355 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.166391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.166404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.166422 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.166434 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.269103 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.269427 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.269546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.269733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.269821 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.373041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.373082 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.373090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.373105 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.373114 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.475810 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.476047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.476173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.476245 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.476308 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.579118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.579166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.579178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.579197 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.579210 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.682168 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.682427 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.682523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.682613 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.682694 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.785746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.785801 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.785812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.785832 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.785847 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.888179 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.888296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.888409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.888497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.888515 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.992239 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.992304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.992324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.992351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:20 crc kubenswrapper[4907]: I1009 19:29:20.992369 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:20Z","lastTransitionTime":"2025-10-09T19:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.095657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.095721 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.095739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.095758 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.095774 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.151126 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.151181 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.151378 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.151446 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.199885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.199946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.199965 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.199992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.200012 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.304300 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.304379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.304401 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.304435 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.304457 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.407148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.407206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.407223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.407247 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.407265 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.502266 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.502365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.502396 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.502429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.502453 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.516988 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:21Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.521990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.522046 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.522065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.522094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.522114 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.541125 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:21Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.546667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.547291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.547366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.547440 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.547527 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.561666 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:21Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.567337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.567500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.567684 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.567794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.567889 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.583924 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:21Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.589677 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.589752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.589774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.589804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.589824 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.611865 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:21Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:21 crc kubenswrapper[4907]: E1009 19:29:21.612568 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.615347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.615575 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.615802 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.615983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.616140 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.721429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.721554 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.721575 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.721609 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.721628 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.761652 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.763048 4907 scope.go:117] "RemoveContainer" containerID="51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.827886 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.827966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.827988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.828019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.828039 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.932033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.932789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.932820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.932861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:21 crc kubenswrapper[4907]: I1009 19:29:21.932889 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:21Z","lastTransitionTime":"2025-10-09T19:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.035894 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.035966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.035992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.036029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.036052 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.139705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.139769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.139780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.139804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.139818 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.150649 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.150767 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:22 crc kubenswrapper[4907]: E1009 19:29:22.150961 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:22 crc kubenswrapper[4907]: E1009 19:29:22.151150 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.242480 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.242517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.242527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.242543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.242552 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.345738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.345783 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.345793 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.345817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.345829 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.449243 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.449287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.449297 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.449315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.449326 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.503715 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/1.log" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.507430 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.508084 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.530893 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 
19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.552009 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.552060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.552071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.552092 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.552103 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.562993 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling 
w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.576397 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.590794 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.602222 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.614842 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc 
kubenswrapper[4907]: I1009 19:29:22.627282 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.643237 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.655343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.655394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.655408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.655429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.655443 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.663878 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.679161 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.693514 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.712522 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.729693 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.744320 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.757603 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.757663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.757676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.757701 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.757716 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.758506 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.778188 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:22Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.861500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.861547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.861558 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.861575 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.861585 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.964061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.964099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.964110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.964126 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:22 crc kubenswrapper[4907]: I1009 19:29:22.964136 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:22Z","lastTransitionTime":"2025-10-09T19:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.065899 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.065934 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.065943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.065958 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.065969 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.150993 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.151061 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:23 crc kubenswrapper[4907]: E1009 19:29:23.151134 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:23 crc kubenswrapper[4907]: E1009 19:29:23.151197 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.167760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.167843 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.167871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.167905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.167930 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.270729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.270806 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.270828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.270860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.270880 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.372919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.372952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.372961 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.372976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.372986 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.475525 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.475556 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.475567 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.475583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.475593 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.512652 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/2.log" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.513396 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/1.log" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.516102 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a" exitCode=1 Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.516139 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.516209 4907 scope.go:117] "RemoveContainer" containerID="51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.517633 4907 scope.go:117] "RemoveContainer" containerID="884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a" Oct 09 19:29:23 crc kubenswrapper[4907]: E1009 19:29:23.517922 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.534417 4907 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.549426 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc 
kubenswrapper[4907]: I1009 19:29:23.566961 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062
da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.578549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.578581 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.578593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.578611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.578624 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.594454 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51761030ee0987c6aefe6baa2984d21a794fb95170381f74e6be13f04e36bd8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:06Z\\\",\\\"message\\\":\\\"ate operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-multus/multus-admission-controller]} name:Service_openshift-multus/multus-admission-controller_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:06.362357 6337 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-z8tzv\\\\nF1009 19:29:06.362357 6337 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling w\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed 
*v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d
\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.615234 4907 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.636672 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.654574 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.673662 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.682739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.682812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.682832 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.682859 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.682881 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.690999 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.705626 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.721223 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.741139 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.761751 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.777388 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.786596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.786630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.786639 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.786653 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.786664 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.797204 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.815145 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.889394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.889452 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.889481 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.889503 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.889516 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.992773 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.992825 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.992839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.992864 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:23 crc kubenswrapper[4907]: I1009 19:29:23.992878 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:23Z","lastTransitionTime":"2025-10-09T19:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.096957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.097008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.097023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.097049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.097064 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.150870 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:24 crc kubenswrapper[4907]: E1009 19:29:24.151018 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.151217 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:24 crc kubenswrapper[4907]: E1009 19:29:24.151543 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.200946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.200988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.200999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.201015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.201024 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.305543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.305620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.305634 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.305657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.305672 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.410003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.410069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.410087 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.410114 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.410132 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.513438 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.513559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.513580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.513609 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.513630 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.523671 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/2.log" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.531373 4907 scope.go:117] "RemoveContainer" containerID="884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a" Oct 09 19:29:24 crc kubenswrapper[4907]: E1009 19:29:24.531587 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.556320 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.580370 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.604907 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.618001 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.618086 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.618115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.618154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.618180 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.627686 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.647161 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.665903 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc 
kubenswrapper[4907]: I1009 19:29:24.682865 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:24 crc kubenswrapper[4907]: E1009 19:29:24.683054 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:24 crc kubenswrapper[4907]: E1009 19:29:24.683125 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:40.683108386 +0000 UTC m=+66.215075885 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.691267 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c91
34253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.715312 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.721566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.721625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.721646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.721676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.721696 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.733113 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.751821 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.771311 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.789987 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.809276 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.824646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.825211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.825505 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.825727 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.825945 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.825130 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.843863 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.861738 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:24Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.929814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.929882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.929908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.929938 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:24 crc kubenswrapper[4907]: I1009 19:29:24.929964 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:24Z","lastTransitionTime":"2025-10-09T19:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.033788 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.033875 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.033901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.033945 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.033973 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.138580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.138709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.138737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.138773 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.138797 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.151134 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.151196 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:25 crc kubenswrapper[4907]: E1009 19:29:25.151332 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:25 crc kubenswrapper[4907]: E1009 19:29:25.151523 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.175671 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2
808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b
1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.195517 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.218922 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.242060 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.244755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.244860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.244889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.244927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.244956 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.261740 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.284419 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.303571 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e
78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.319592 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc 
kubenswrapper[4907]: I1009 19:29:25.346497 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062
da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.349222 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.349294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.349320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.349357 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.349385 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.381398 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 
19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.398335 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.417360 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.444761 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.464596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.464637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.464647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.464665 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.464677 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.483521 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.502047 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.518574 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:25Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.566302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.566337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.566370 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.566384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.566392 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.670299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.670384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.670404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.670434 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.670457 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.773874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.773919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.773931 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.773948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.773959 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.877036 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.877116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.877135 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.877172 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.877196 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.981299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.981364 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.981382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.981407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:25 crc kubenswrapper[4907]: I1009 19:29:25.981422 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:25Z","lastTransitionTime":"2025-10-09T19:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.085113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.085170 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.085181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.085199 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.085211 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.151545 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.151620 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:26 crc kubenswrapper[4907]: E1009 19:29:26.151721 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:26 crc kubenswrapper[4907]: E1009 19:29:26.151938 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.187924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.187968 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.187980 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.187999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.188010 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.291121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.291176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.291189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.291208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.291224 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.394493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.395658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.395774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.395861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.395938 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.500110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.500163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.500963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.501002 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.501017 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.604168 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.604744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.605663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.605737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.605764 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.708633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.708680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.708693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.708711 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.708724 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.812397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.812501 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.812522 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.812558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.812577 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.916059 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.916103 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.916116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.916136 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:26 crc kubenswrapper[4907]: I1009 19:29:26.916147 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:26Z","lastTransitionTime":"2025-10-09T19:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.019629 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.019725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.019746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.019776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.019801 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.111679 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.111998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.112102 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112173 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:29:59.112090273 +0000 UTC m=+84.644057802 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.112328 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112378 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.112448 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112448 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112582 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered 
Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112615 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112633 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112644 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112663 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112674 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112532 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:59.112459352 +0000 UTC m=+84.644426891 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112939 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:59.112895093 +0000 UTC m=+84.644862802 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.112993 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:59.112973925 +0000 UTC m=+84.644941664 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.113034 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:29:59.113014686 +0000 UTC m=+84.644982455 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.123081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.123165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.123196 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.123241 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.123273 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.150461 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.150604 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.150672 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:27 crc kubenswrapper[4907]: E1009 19:29:27.150778 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.226654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.227714 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.227757 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.227782 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.227800 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.331384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.331490 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.331505 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.331535 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.331550 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.434337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.434375 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.434386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.434402 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.434413 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.538602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.539000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.539091 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.539178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.539264 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.642787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.642857 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.642878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.642909 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.642935 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.747769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.748368 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.748573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.748731 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.748881 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.852380 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.852454 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.852513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.852545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.852561 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.956031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.956105 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.956128 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.956157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:27 crc kubenswrapper[4907]: I1009 19:29:27.956175 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:27Z","lastTransitionTime":"2025-10-09T19:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.059246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.059336 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.059362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.059392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.059415 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.150987 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.151085 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:28 crc kubenswrapper[4907]: E1009 19:29:28.151243 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:28 crc kubenswrapper[4907]: E1009 19:29:28.151392 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.162367 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.162422 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.162435 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.162455 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.162486 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.265703 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.265736 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.265747 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.265760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.265770 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.367449 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.367889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.367903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.367924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.367938 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.471003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.471067 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.471082 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.471107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.471123 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.574670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.574724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.574737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.574756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.574768 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.678651 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.678777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.678802 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.678836 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.678855 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.782513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.782951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.783124 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.783296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.783567 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.887262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.887349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.887369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.887400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.887420 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.992730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.992809 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.992828 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.992856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:28 crc kubenswrapper[4907]: I1009 19:29:28.992875 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:28Z","lastTransitionTime":"2025-10-09T19:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.097109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.097694 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.097862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.098090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.098221 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.150750 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.150953 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:29 crc kubenswrapper[4907]: E1009 19:29:29.151291 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:29 crc kubenswrapper[4907]: E1009 19:29:29.151445 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.202463 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.202561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.202584 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.202612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.202633 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.306044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.306080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.306088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.306102 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.306111 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.410061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.410119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.410140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.410166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.410186 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.513844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.513909 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.513930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.513958 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.513980 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.616754 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.616819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.616840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.616872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.616895 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.719809 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.719871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.719882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.719902 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.719911 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.823622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.823687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.823699 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.823717 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.823747 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.927341 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.927402 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.927416 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.927443 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:29 crc kubenswrapper[4907]: I1009 19:29:29.927457 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:29Z","lastTransitionTime":"2025-10-09T19:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.031049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.031109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.031123 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.031146 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.031161 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.133457 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.133544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.133557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.133573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.133583 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.150976 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.151060 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:30 crc kubenswrapper[4907]: E1009 19:29:30.151204 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:30 crc kubenswrapper[4907]: E1009 19:29:30.151374 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.237094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.237729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.237762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.237790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.237805 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.297888 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.318007 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.332554 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.340224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.340286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.340311 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.340345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.340376 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.355753 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is 
after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.376715 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.392558 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.412766 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c91
34253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.442229 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.443936 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.443970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.443980 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.444016 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.444029 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.459546 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.479557 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.499165 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.516010 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc 
kubenswrapper[4907]: I1009 19:29:30.533027 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d
5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.547330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.547378 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.547389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.547407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.547419 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.561569 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.585404 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.601566 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.612585 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.629722 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2
bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:30Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.651315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.651388 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.651407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.651436 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.651455 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.754287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.754875 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.755051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.755219 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.755341 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.857988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.858040 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.858050 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.858071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.858084 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.960804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.960916 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.960930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.960949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:30 crc kubenswrapper[4907]: I1009 19:29:30.960962 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:30Z","lastTransitionTime":"2025-10-09T19:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.063541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.063617 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.063631 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.063657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.063673 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.150448 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.150624 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.150703 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.151067 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.166520 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.166556 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.166566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.166589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.166600 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.269770 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.269837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.269854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.269882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.269902 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.373223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.374730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.374911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.375065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.375207 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.478248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.478307 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.478326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.478354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.478374 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.581695 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.581733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.581743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.581761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.581771 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.685349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.685645 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.685778 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.685874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.685961 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.789566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.789880 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.790007 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.790133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.790242 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.870397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.870441 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.870453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.870500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.870519 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.888058 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:31Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.893698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.893827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.893849 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.893872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.893890 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.911492 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:31Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.916244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.916272 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.916282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.916300 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.916309 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.937024 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:31Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.942957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.943000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.943016 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.943047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.943064 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.961857 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:31Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.975115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.975164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.975181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.975206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.975222 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.997717 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:31Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:31 crc kubenswrapper[4907]: E1009 19:29:31.997951 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.999800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.999831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:31 crc kubenswrapper[4907]: I1009 19:29:31.999845 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:31.999868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:31.999883 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:31Z","lastTransitionTime":"2025-10-09T19:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.103173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.103241 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.103263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.103294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.103321 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.151196 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.151380 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:32 crc kubenswrapper[4907]: E1009 19:29:32.151496 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:32 crc kubenswrapper[4907]: E1009 19:29:32.151693 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.206887 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.206938 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.206974 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.206998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.207012 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.309538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.309593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.309611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.309639 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.309659 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.411810 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.411864 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.411876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.411896 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.411910 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.514328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.514397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.514419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.514450 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.514740 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.617481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.617536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.617545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.617562 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.617577 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.719612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.719656 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.719665 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.719680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.719690 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.822109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.822152 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.822162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.822179 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.822191 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.924641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.924673 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.924699 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.924712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:32 crc kubenswrapper[4907]: I1009 19:29:32.924721 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:32Z","lastTransitionTime":"2025-10-09T19:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.026823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.026863 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.026871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.026887 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.026896 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.129642 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.129882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.129943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.130065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.130130 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.151151 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.151176 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:33 crc kubenswrapper[4907]: E1009 19:29:33.151262 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:33 crc kubenswrapper[4907]: E1009 19:29:33.151309 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.233028 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.233072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.233086 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.233108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.233123 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.335637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.335733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.335753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.336293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.336366 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.439784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.439844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.439861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.439890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.439915 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.543976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.544017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.544033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.544051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.544062 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.647218 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.647325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.647345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.647376 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.647396 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.750768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.750814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.750823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.750840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.750851 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.854786 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.854851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.854864 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.854881 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.854892 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.958822 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.958907 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.958928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.958959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:33 crc kubenswrapper[4907]: I1009 19:29:33.958983 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:33Z","lastTransitionTime":"2025-10-09T19:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.062294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.062358 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.062370 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.062385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.062395 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.151241 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.151283 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:34 crc kubenswrapper[4907]: E1009 19:29:34.151688 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:34 crc kubenswrapper[4907]: E1009 19:29:34.151810 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.164833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.164875 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.164889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.164906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.164922 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.267287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.267331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.267340 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.267356 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.267364 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.374822 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.374930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.374954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.374994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.375013 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.478243 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.478345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.478372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.478407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.478433 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.582030 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.582106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.582128 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.582158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.582186 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.684742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.684779 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.684787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.684800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.684811 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.787100 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.787141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.787149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.787165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.787173 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.889133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.889199 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.889209 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.889223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.889233 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.992495 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.992541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.992551 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.992568 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:34 crc kubenswrapper[4907]: I1009 19:29:34.992578 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:34Z","lastTransitionTime":"2025-10-09T19:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.097231 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.097265 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.097273 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.097291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.097303 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.150672 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:35 crc kubenswrapper[4907]: E1009 19:29:35.150801 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.150895 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:35 crc kubenswrapper[4907]: E1009 19:29:35.150996 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.173956 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.188647 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.201712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.201769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.201779 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.201793 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.201805 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.204754 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z 
is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.216549 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.225852 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc 
kubenswrapper[4907]: I1009 19:29:35.238230 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062
da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.253369 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.271553 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.283936 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.293675 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.303713 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.304282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.304317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.304325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.304341 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.304351 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.313487 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.325922 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.336358 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.346323 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.355309 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.366226 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:35Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.406662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.406853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.406951 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.407041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.407120 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.509599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.509662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.509672 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.509687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.509696 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.612343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.612403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.612417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.612435 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.612447 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.714736 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.714768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.714777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.714791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.714800 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.817315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.817355 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.817365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.817380 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.817400 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.922337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.923118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.923311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.923519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:35 crc kubenswrapper[4907]: I1009 19:29:35.923685 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:35Z","lastTransitionTime":"2025-10-09T19:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.026296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.026334 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.026344 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.026362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.026372 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.129092 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.129413 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.129504 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.129589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.129658 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.151429 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:36 crc kubenswrapper[4907]: E1009 19:29:36.151589 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.151611 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:36 crc kubenswrapper[4907]: E1009 19:29:36.151773 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.231410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.231444 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.231453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.231493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.231502 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.334615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.334659 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.334668 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.334684 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.334693 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.437883 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.437937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.437949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.437972 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.437984 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.541738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.541818 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.541838 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.541866 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.541885 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.646098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.646169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.646187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.646213 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.646230 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.749451 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.749550 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.749571 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.749600 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.749624 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.852994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.853052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.853065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.853081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.853090 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.955847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.955876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.955888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.955904 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:36 crc kubenswrapper[4907]: I1009 19:29:36.955914 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:36Z","lastTransitionTime":"2025-10-09T19:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.058300 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.058373 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.058390 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.058407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.058441 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.151545 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.151679 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:37 crc kubenswrapper[4907]: E1009 19:29:37.151702 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:37 crc kubenswrapper[4907]: E1009 19:29:37.151908 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.153353 4907 scope.go:117] "RemoveContainer" containerID="884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a" Oct 09 19:29:37 crc kubenswrapper[4907]: E1009 19:29:37.153751 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.161704 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.161737 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.161745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.161760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 
19:29:37.161771 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.265385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.265532 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.265566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.265626 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.265657 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.369249 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.369291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.369304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.369324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.369337 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.472430 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.472570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.472589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.472838 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.472871 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.575901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.575967 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.575995 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.576030 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.576058 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.679434 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.679519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.679528 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.679549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.679559 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.783160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.783226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.783234 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.783253 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.783267 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.887290 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.887366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.887384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.887408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.887428 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.990653 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.990714 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.990732 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.990756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:37 crc kubenswrapper[4907]: I1009 19:29:37.990774 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:37Z","lastTransitionTime":"2025-10-09T19:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.093117 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.093213 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.093243 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.093277 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.093305 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.150697 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.150697 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:38 crc kubenswrapper[4907]: E1009 19:29:38.150915 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:38 crc kubenswrapper[4907]: E1009 19:29:38.151046 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.196786 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.196871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.196898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.196935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.196956 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.300027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.300079 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.300097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.300118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.300133 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.404145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.404215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.404233 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.404259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.404275 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.507119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.507186 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.507201 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.507258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.507323 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.610682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.610755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.610793 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.610830 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.610850 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.714217 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.714282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.714298 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.714319 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.714336 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.817229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.817326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.817346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.817378 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.817399 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.920909 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.920977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.920988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.921023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:38 crc kubenswrapper[4907]: I1009 19:29:38.921038 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:38Z","lastTransitionTime":"2025-10-09T19:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.024097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.024142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.024150 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.024165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.024175 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.126573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.126642 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.126661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.126690 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.126707 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.150994 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.151078 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:39 crc kubenswrapper[4907]: E1009 19:29:39.151254 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:39 crc kubenswrapper[4907]: E1009 19:29:39.151400 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.229807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.229862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.229874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.229891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.229901 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.333209 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.333354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.333373 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.333398 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.333413 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.436328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.436375 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.436386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.436407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.436421 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.539397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.539458 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.539507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.539531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.539549 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.642784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.642834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.642844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.642860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.642870 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.746763 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.746854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.746877 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.746905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.746924 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.850225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.850293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.850308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.850337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.850358 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.954941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.955027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.955047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.955075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:39 crc kubenswrapper[4907]: I1009 19:29:39.955095 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:39Z","lastTransitionTime":"2025-10-09T19:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.059061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.059137 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.059158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.059187 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.059206 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.159632 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:40 crc kubenswrapper[4907]: E1009 19:29:40.160321 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.160652 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:40 crc kubenswrapper[4907]: E1009 19:29:40.160874 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.163312 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.163404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.163424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.163453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.163500 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.266417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.266489 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.266500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.266521 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.266532 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.369657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.369707 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.369725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.369749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.369770 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.472671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.472747 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.472762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.472791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.472808 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.579982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.580053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.580068 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.580094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.580108 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.683076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.683158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.683198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.683236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.683318 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.774028 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:40 crc kubenswrapper[4907]: E1009 19:29:40.774324 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:40 crc kubenswrapper[4907]: E1009 19:29:40.774525 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:12.774485662 +0000 UTC m=+98.306453171 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.786622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.786697 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.786712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.786743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.786762 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.890807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.890873 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.890886 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.890906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.890920 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.994287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.994330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.994339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.994355 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:40 crc kubenswrapper[4907]: I1009 19:29:40.994368 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:40Z","lastTransitionTime":"2025-10-09T19:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.097487 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.097533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.097546 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.097567 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.097583 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.150886 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.150987 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:41 crc kubenswrapper[4907]: E1009 19:29:41.151154 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:41 crc kubenswrapper[4907]: E1009 19:29:41.151312 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.200287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.200349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.200365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.200398 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.200417 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.303659 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.303721 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.303733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.303753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.303766 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.407045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.407108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.407120 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.407142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.407157 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.509891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.509959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.509973 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.509996 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.510012 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.612564 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.612611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.612624 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.612642 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.612655 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.716109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.716160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.716171 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.716189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.716200 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.819949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.820021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.820039 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.820069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.820092 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.923200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.923242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.923251 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.923267 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:41 crc kubenswrapper[4907]: I1009 19:29:41.923278 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:41Z","lastTransitionTime":"2025-10-09T19:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.026150 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.026217 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.026235 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.026261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.026281 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.130397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.130524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.130554 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.130592 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.130617 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.144161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.144225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.144252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.144300 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.144321 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.151022 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.151121 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.151342 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.151623 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.160231 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:42Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.165483 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.165526 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.165536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.165555 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.165569 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.180733 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:42Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.186881 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.186970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.186997 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.187038 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.187063 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.210244 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:42Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.215271 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.215318 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.215330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.215353 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.215367 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.228626 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:42Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.233235 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.233286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.233301 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.233320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.233331 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.246236 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:42Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:42 crc kubenswrapper[4907]: E1009 19:29:42.246394 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.248315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.248366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.248377 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.248392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.248417 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.351199 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.351360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.351390 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.351430 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.351455 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.455095 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.455179 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.455203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.455235 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.455256 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.558119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.558207 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.558231 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.558268 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.558291 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.662349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.662428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.662443 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.662482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.662495 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.765907 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.765990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.766012 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.766042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.766066 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.868739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.868794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.868806 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.868825 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.868837 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.972292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.972343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.972358 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.972379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:42 crc kubenswrapper[4907]: I1009 19:29:42.972396 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:42Z","lastTransitionTime":"2025-10-09T19:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.076025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.076401 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.076564 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.076671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.076765 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.150612 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.150612 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:43 crc kubenswrapper[4907]: E1009 19:29:43.150881 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:43 crc kubenswrapper[4907]: E1009 19:29:43.151494 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.180918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.180983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.181001 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.181028 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.181044 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.284277 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.284401 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.284424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.284453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.284498 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.387969 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.388050 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.388078 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.388115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.388135 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.491514 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.491568 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.491583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.491600 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.491612 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.594949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.595019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.595033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.595053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.595065 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.613442 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/0.log" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.613571 4907 generic.go:334] "Generic (PLEG): container finished" podID="64344fcc-f9f2-424f-a32b-44927641b614" containerID="4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22" exitCode=1 Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.613622 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerDied","Data":"4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.614238 4907 scope.go:117] "RemoveContainer" containerID="4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.631102 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.652527 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.669907 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.692189 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.708623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.708663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.708680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.708700 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.708716 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.716785 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.733739 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.761597 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2
bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.783402 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.801169 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.811665 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.811921 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.812033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.812237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.812699 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.819233 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.836228 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.860907 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c91
34253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.881664 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.897924 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.912690 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.916887 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.916937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.916948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.916970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.916985 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:43Z","lastTransitionTime":"2025-10-09T19:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.926914 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:43 crc kubenswrapper[4907]: I1009 19:29:43.939005 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:43Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc 
kubenswrapper[4907]: I1009 19:29:44.019894 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.019946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.019963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.019983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.019998 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.123026 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.123619 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.123827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.123989 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.124124 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.150998 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.151101 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:44 crc kubenswrapper[4907]: E1009 19:29:44.151404 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:44 crc kubenswrapper[4907]: E1009 19:29:44.151632 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.227531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.227611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.227643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.227675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.227700 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.330212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.330285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.330300 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.330324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.330341 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.433486 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.433544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.433558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.433576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.433587 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.537582 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.537657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.537677 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.537708 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.537731 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.620858 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/0.log" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.620945 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerStarted","Data":"40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.643968 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc 
kubenswrapper[4907]: I1009 19:29:44.644823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.644889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.644905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.644927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.644941 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.661551 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.685148 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 
19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.702148 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.719603 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.734364 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.747657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.747851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.747895 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc 
kubenswrapper[4907]: I1009 19:29:44.747920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.747993 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.748250 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.760732 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.771655 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.790213 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.808480 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.827662 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.849564 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.851957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.851993 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.852006 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.852023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.852037 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.863838 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5e
b7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.881363 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.896610 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.909218 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:44Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.955070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.955119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.955137 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.955162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:44 crc kubenswrapper[4907]: I1009 19:29:44.955181 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:44Z","lastTransitionTime":"2025-10-09T19:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.058386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.058509 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.058531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.058557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.058578 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.151063 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.151205 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:45 crc kubenswrapper[4907]: E1009 19:29:45.151276 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:45 crc kubenswrapper[4907]: E1009 19:29:45.151499 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.162160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.162232 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.162258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.162287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.162312 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.177087 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a7509
32b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.194385 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.216138 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.233123 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.245823 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.259976 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.267763 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.267823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.267841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.267867 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.267885 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.275724 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.290815 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.304622 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.323576 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.338788 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.353869 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c91
34253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.371876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.371950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.371978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.372010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.372034 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.390670 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 
19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.409042 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.424028 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.443134 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.460453 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:45Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:45 crc 
kubenswrapper[4907]: I1009 19:29:45.474657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.474718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.474728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.474745 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.474757 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.582534 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.582627 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.582655 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.582688 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.582725 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.686453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.686586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.686614 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.686651 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.686680 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.790138 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.790212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.790232 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.790259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.790280 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.893153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.893403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.893566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.893638 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.893735 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.996172 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.996228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.996242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.996264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:45 crc kubenswrapper[4907]: I1009 19:29:45.996291 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:45Z","lastTransitionTime":"2025-10-09T19:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.100124 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.100448 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.100554 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.100631 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.100703 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.151517 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.151557 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:46 crc kubenswrapper[4907]: E1009 19:29:46.151972 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:46 crc kubenswrapper[4907]: E1009 19:29:46.152192 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.204058 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.204308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.204409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.204499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.204561 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.307784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.307851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.307876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.307911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.307935 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.411357 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.411944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.412138 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.412301 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.412555 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.515641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.515687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.515709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.515736 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.515755 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.619120 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.619185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.619211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.619244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.619267 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.722573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.722606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.722615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.722630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.722641 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.824625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.824667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.824676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.824692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.824701 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.927126 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.927186 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.927215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.927244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:46 crc kubenswrapper[4907]: I1009 19:29:46.927259 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:46Z","lastTransitionTime":"2025-10-09T19:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.030349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.030397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.030410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.030428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.030441 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.132330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.132396 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.132408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.132430 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.132449 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.150610 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.150771 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:47 crc kubenswrapper[4907]: E1009 19:29:47.150954 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:47 crc kubenswrapper[4907]: E1009 19:29:47.151176 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.235151 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.235195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.235205 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.235222 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.235235 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.338149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.338195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.338206 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.338225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.338236 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.440869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.440931 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.440946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.440972 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.440990 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.544523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.544577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.544597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.544623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.544643 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.647764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.647844 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.647864 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.647894 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.647916 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.750347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.750437 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.750496 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.750536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.750559 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.854162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.854228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.854247 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.854275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.854297 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.958121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.958203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.958230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.958263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:47 crc kubenswrapper[4907]: I1009 19:29:47.958283 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:47Z","lastTransitionTime":"2025-10-09T19:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.061906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.061978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.061998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.062024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.062044 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.151458 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.151534 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:48 crc kubenswrapper[4907]: E1009 19:29:48.151734 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:48 crc kubenswrapper[4907]: E1009 19:29:48.151894 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.164954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.165012 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.165026 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.165047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.165061 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.267812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.267963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.267978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.267996 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.268008 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.370776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.370822 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.370831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.370848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.370860 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.473302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.473395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.473424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.473516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.473550 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.576741 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.576800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.576813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.576833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.576862 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.680124 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.680194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.680215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.680252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.680279 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.783432 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.783539 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.783560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.783587 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.783606 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.886217 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.886246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.886255 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.886268 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.886277 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.989443 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.989513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.989528 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.989554 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:48 crc kubenswrapper[4907]: I1009 19:29:48.989568 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:48Z","lastTransitionTime":"2025-10-09T19:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.091743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.091812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.091827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.091849 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.091862 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.150946 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.151026 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:49 crc kubenswrapper[4907]: E1009 19:29:49.151108 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:49 crc kubenswrapper[4907]: E1009 19:29:49.151195 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.194051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.194090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.194099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.194116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.194133 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.296536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.296580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.296589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.296603 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.296613 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.399008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.399044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.399053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.399072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.399083 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.502639 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.502698 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.502707 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.502727 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.502738 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.605482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.605523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.605532 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.605547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.605557 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.708352 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.708397 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.708407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.708423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.708432 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.811699 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.811803 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.811819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.811838 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.811849 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.914405 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.914457 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.914482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.914499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:49 crc kubenswrapper[4907]: I1009 19:29:49.914513 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:49Z","lastTransitionTime":"2025-10-09T19:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.017988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.018032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.018041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.018058 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.018068 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.121966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.122041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.122067 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.122102 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.122128 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.150674 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:50 crc kubenswrapper[4907]: E1009 19:29:50.150866 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.151035 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:50 crc kubenswrapper[4907]: E1009 19:29:50.151356 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.226235 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.226280 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.226291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.226306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.226316 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.334716 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.334750 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.334762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.334781 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.334791 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.438079 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.438137 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.438153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.438176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.438193 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.541407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.541502 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.541524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.541553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.541573 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.644998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.645065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.645081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.645109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.645128 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.748006 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.748049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.748060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.748078 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.748089 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.850615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.850664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.850676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.850696 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.850710 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.953458 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.953522 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.953536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.953553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:50 crc kubenswrapper[4907]: I1009 19:29:50.953566 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:50Z","lastTransitionTime":"2025-10-09T19:29:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.057014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.057094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.057112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.057144 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.057172 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.151277 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.151277 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:51 crc kubenswrapper[4907]: E1009 19:29:51.151718 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:51 crc kubenswrapper[4907]: E1009 19:29:51.151931 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.163897 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.163939 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.163947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.163960 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.163969 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.266458 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.266570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.266617 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.266651 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.266674 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.369615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.369663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.369680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.369701 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.369714 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.472923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.472965 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.472975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.472990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.473001 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.575358 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.575390 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.575400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.575416 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.575425 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.678201 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.678247 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.678257 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.678275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.678336 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.781568 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.781620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.781633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.781648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.781660 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.884447 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.884533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.884548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.884571 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.884587 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.988315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.988373 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.988387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.988409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:51 crc kubenswrapper[4907]: I1009 19:29:51.988423 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:51Z","lastTransitionTime":"2025-10-09T19:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.092664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.092760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.092796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.092827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.092853 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.151109 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.151657 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.151711 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.152533 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.153137 4907 scope.go:117] "RemoveContainer" containerID="884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.196798 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.196862 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.196884 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.196916 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.196942 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.300335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.300403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.300429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.300495 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.300523 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.403345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.403389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.403402 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.403422 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.403438 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.508234 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.508281 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.508291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.508308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.508317 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.526149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.526197 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.526215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.526241 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.526252 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.544825 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.550417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.550527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.550548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.550576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.550594 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.566002 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.571287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.571329 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.571344 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.571366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.571382 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.593117 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.600673 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.600719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.600730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.600749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.600762 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.621642 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.626620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.626705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.626728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.626762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.626787 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.645934 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: E1009 19:29:52.646174 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.648395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.648459 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.648485 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.648508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.648525 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.656212 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/2.log" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.659384 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.660321 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.681211 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c61
09a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.696502 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.711726 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.725011 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.737866 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.751809 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.751888 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc 
kubenswrapper[4907]: I1009 19:29:52.751920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.751960 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.751987 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.752081 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] 
Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.768866 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.781713 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc 
kubenswrapper[4907]: I1009 19:29:52.802682 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062
da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.836937 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 
19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.851344 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.854243 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.854316 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.854335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.854358 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.854376 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.868312 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.884268 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.899046 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.913520 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.925062 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.938202 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:52Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.957043 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.957081 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.957093 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.957130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:52 crc kubenswrapper[4907]: I1009 19:29:52.957154 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:52Z","lastTransitionTime":"2025-10-09T19:29:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.059726 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.059777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.059787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.059807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.059822 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.150707 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.150887 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:53 crc kubenswrapper[4907]: E1009 19:29:53.151056 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:53 crc kubenswrapper[4907]: E1009 19:29:53.151364 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.163005 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.163057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.163070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.163083 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.163097 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.266730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.266816 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.266833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.266861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.266885 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.370423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.370566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.370602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.370632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.370656 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.474559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.474644 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.474669 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.474703 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.474727 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.578405 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.578524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.578544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.578583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.578604 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.667727 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/3.log" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.669081 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/2.log" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.678785 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a" exitCode=1 Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.678865 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.679043 4907 scope.go:117] "RemoveContainer" containerID="884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.679866 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a" Oct 09 19:29:53 crc kubenswrapper[4907]: E1009 19:29:53.680182 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.681263 4907 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.681359 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.681456 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.681520 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.681548 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.707265 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.723039 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.742544 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.759212 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.775319 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.784625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.784660 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.784683 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.784700 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.784710 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.798699 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.832065 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://884a87bdc45dd5d5775d94a5820e19f49df5e2e925198322faee1c83c778962a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:22Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 19:29:22.709180 6556 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 19:29:22.709211 6556 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 19:29:22.709235 6556 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 
19:29:22.709286 6556 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 19:29:22.709307 6556 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 19:29:22.709379 6556 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 19:29:22.709288 6556 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 19:29:22.709384 6556 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1009 19:29:22.709448 6556 factory.go:656] Stopping watch factory\\\\nI1009 19:29:22.709484 6556 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 19:29:22.709524 6556 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:22.709417 6556 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 19:29:22.709522 6556 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 19:29:22.709594 6556 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 19:29:22.709698 6556 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:53Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:53.049522 6913 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 19:29:53.049577 6913 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:53.049619 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 19:29:53.049633 6913 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI1009 19:29:53.049661 6913 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.150464ms\\\\nI1009 19:29:53.049680 6913 services_controller.go:356] Processing sync for service openshift-dns/dns-default for network=default\\\\nF1009 19:29:53.049749 6913 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-di
r\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name
\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: 
I1009 19:29:53.849709 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.872864 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09
T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\
\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.888171 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.888256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.888279 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.888313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.888337 4907 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.894055 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.913727 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc 
kubenswrapper[4907]: I1009 19:29:53.933763 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.955102 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.976772 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.991076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.991132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.991153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.991185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.991208 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:53Z","lastTransitionTime":"2025-10-09T19:29:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:53 crc kubenswrapper[4907]: I1009 19:29:53.996565 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:53Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.012533 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.028326 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.094781 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.094873 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.094893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.094921 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.094945 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.151025 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.151101 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:54 crc kubenswrapper[4907]: E1009 19:29:54.151281 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:54 crc kubenswrapper[4907]: E1009 19:29:54.151436 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.198399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.198451 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.198486 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.198506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.198524 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.301780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.301826 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.301854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.301871 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.301882 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.405323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.405426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.405449 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.405520 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.405558 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.509158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.509226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.509236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.509258 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.509269 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.612751 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.612847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.612889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.612927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.612950 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.685635 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/3.log" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.691113 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a" Oct 09 19:29:54 crc kubenswrapper[4907]: E1009 19:29:54.691328 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.707921 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.717075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.717155 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.717178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc 
kubenswrapper[4907]: I1009 19:29:54.717208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.717228 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.722030 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc 
kubenswrapper[4907]: I1009 19:29:54.740289 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062
da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.766917 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:53Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:53.049522 6913 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 19:29:53.049577 6913 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:53.049619 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 19:29:53.049633 6913 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI1009 19:29:53.049661 6913 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.150464ms\\\\nI1009 19:29:53.049680 6913 services_controller.go:356] Processing sync for service openshift-dns/dns-default for network=default\\\\nF1009 19:29:53.049749 6913 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.785815 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.807419 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.821455 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.821544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.821563 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.821590 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.821606 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.825507 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.841018 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.858839 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.875626 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.894162 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.912664 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.927070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.927243 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.927330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:54 crc 
kubenswrapper[4907]: I1009 19:29:54.927453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.927569 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:54Z","lastTransitionTime":"2025-10-09T19:29:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.937858 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29
1c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.959453 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.979377 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:54 crc kubenswrapper[4907]: I1009 19:29:54.995568 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:54Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.012033 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.030777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.030837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.030859 4907 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.030889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.030913 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.133766 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.134067 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.134085 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.134105 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.134116 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.150759 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:55 crc kubenswrapper[4907]: E1009 19:29:55.150908 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.151080 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:55 crc kubenswrapper[4907]: E1009 19:29:55.151373 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.171358 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-
plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197baa0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.189759 4907 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.210827 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.227324 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.237879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.237918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:55 crc 
kubenswrapper[4907]: I1009 19:29:55.237927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.237948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.237962 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.249329 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] 
Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.267431 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.287590 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc 
kubenswrapper[4907]: I1009 19:29:55.308815 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062
da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.335722 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:53Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:53.049522 6913 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 19:29:53.049577 6913 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:53.049619 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 19:29:53.049633 6913 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI1009 19:29:53.049661 6913 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.150464ms\\\\nI1009 19:29:53.049680 6913 services_controller.go:356] Processing sync for service openshift-dns/dns-default for network=default\\\\nF1009 19:29:53.049749 6913 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.341657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.341736 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.341766 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.341804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.341827 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.357343 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.379917 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.395101 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.412411 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.430809 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.447893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.447961 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.447980 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.448007 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.448026 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.448606 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.468799 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z" Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.489618 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:29:55Z is after 2025-08-24T17:21:41Z"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.552507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.552583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.552612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.552644 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.552664 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.656755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.656834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.656861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.656899 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.656925 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.762403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.762540 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.762563 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.762592 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.762614 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.865073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.865119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.865129 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.865146 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.865158 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.969094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.969160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.969175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.969198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:55 crc kubenswrapper[4907]: I1009 19:29:55.969213 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:55Z","lastTransitionTime":"2025-10-09T19:29:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.072138 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.072209 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.072226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.072251 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.072271 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.151113 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.151201 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 19:29:56 crc kubenswrapper[4907]: E1009 19:29:56.151303 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b"
Oct 09 19:29:56 crc kubenswrapper[4907]: E1009 19:29:56.151414 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.175243 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.175305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.175323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.175351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.175370 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.278377 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.278428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.278437 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.278455 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.278482 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.382238 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.382316 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.382335 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.382368 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.382388 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.486208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.486304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.486326 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.486362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.486387 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.588988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.589054 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.589078 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.589106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.589130 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.692761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.692807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.692823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.692840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.692851 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.797151 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.797218 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.797239 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.797265 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.797285 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.900363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.900443 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.900504 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.900541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:56 crc kubenswrapper[4907]: I1009 19:29:56.900565 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:56Z","lastTransitionTime":"2025-10-09T19:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.004516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.004589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.004612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.004638 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.004656 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.107281 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.107322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.107331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.107346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.107355 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.151313 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.151349 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 19:29:57 crc kubenswrapper[4907]: E1009 19:29:57.151574 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 19:29:57 crc kubenswrapper[4907]: E1009 19:29:57.151861 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.210343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.210411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.210430 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.210456 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.210538 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.313728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.313826 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.313851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.313885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.313912 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.417597 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.417658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.417678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.417702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.417720 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.522294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.522561 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.522601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.522635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.522660 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.626281 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.626338 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.626350 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.626372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.626386 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.729523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.729609 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.729638 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.729671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.729695 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.833439 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.833544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.833566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.833602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.833621 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.937153 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.937194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.937202 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.937218 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:57 crc kubenswrapper[4907]: I1009 19:29:57.937227 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:57Z","lastTransitionTime":"2025-10-09T19:29:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.041112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.041165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.041176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.041195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.041206 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.144076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.144140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.144151 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.144166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.144176 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.150712 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.150715 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:58 crc kubenswrapper[4907]: E1009 19:29:58.150834 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:29:58 crc kubenswrapper[4907]: E1009 19:29:58.151341 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.248113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.248229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.248259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.248291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.248312 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.351729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.351821 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.351846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.351921 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.351946 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.455174 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.455257 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.455278 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.455307 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.455327 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.559084 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.559173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.559197 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.559228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.559255 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.662403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.662449 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.662475 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.662493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.662504 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.766431 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.766584 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.766615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.766648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.766668 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.871194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.871248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.871261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.871284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.871297 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.974497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.974580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.974599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.974630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:58 crc kubenswrapper[4907]: I1009 19:29:58.974652 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:58Z","lastTransitionTime":"2025-10-09T19:29:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.083419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.084116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.084136 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.084165 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.084192 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.151764 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.151831 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.152190 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.152292 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.173532 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.187013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.187068 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.187088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.187130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.187149 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.206420 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.206588 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.206610 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.206581821 +0000 UTC m=+148.738549340 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.206653 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.206718 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.206788 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.206817 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:29:59 crc 
kubenswrapper[4907]: E1009 19:29:59.206884 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.206855808 +0000 UTC m=+148.738823337 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.206986 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207016 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207051 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207078 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207097 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207114 4907 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207123 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207081 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.207059943 +0000 UTC m=+148.739027472 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207234 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.207208757 +0000 UTC m=+148.739176276 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:59 crc kubenswrapper[4907]: E1009 19:29:59.207286 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.207267988 +0000 UTC m=+148.739235517 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.290202 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.290260 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.290278 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.290304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.290324 4907 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.393357 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.393426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.393446 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.393514 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.393539 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.497649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.497690 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.497702 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.497722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.497736 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.601318 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.601409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.601433 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.601503 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.601528 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.705360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.705403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.705413 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.705429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.705440 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.808846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.808893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.808901 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.808918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.808929 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.911756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.911813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.911827 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.911849 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:29:59 crc kubenswrapper[4907]: I1009 19:29:59.911863 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:29:59Z","lastTransitionTime":"2025-10-09T19:29:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.015387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.015424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.015433 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.015448 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.015457 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.118999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.119060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.119076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.119097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.119118 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.150988 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.151108 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:00 crc kubenswrapper[4907]: E1009 19:30:00.152214 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:00 crc kubenswrapper[4907]: E1009 19:30:00.152462 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.221263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.221684 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.221796 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.221872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.221945 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.323807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.323843 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.323852 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.323869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.323881 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.425877 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.425912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.425923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.425943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.425955 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.529586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.529641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.529652 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.529670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.529681 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.633264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.633316 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.633332 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.633356 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.633377 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.736477 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.736544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.736558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.736579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.736596 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.839826 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.839860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.839868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.839883 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.839892 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.943386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.943431 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.943441 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.943456 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:00 crc kubenswrapper[4907]: I1009 19:30:00.943487 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:00Z","lastTransitionTime":"2025-10-09T19:30:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.046733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.047107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.047195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.047289 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.047369 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.150441 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.150611 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:01 crc kubenswrapper[4907]: E1009 19:30:01.150718 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.150858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.150911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.150923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.150947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: E1009 19:30:01.150979 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.151051 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.255045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.255104 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.255117 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.255141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.255156 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.358338 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.358394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.358420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.358441 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.358454 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.460719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.460788 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.460798 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.460813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.460822 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.564181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.564226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.564236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.564254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.564265 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.668293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.668368 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.668393 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.668419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.668432 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.771769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.771814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.771823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.771843 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.771852 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.875065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.875107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.875116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.875129 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.875141 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.978249 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.978316 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.978336 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.978363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:01 crc kubenswrapper[4907]: I1009 19:30:01.978380 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:01Z","lastTransitionTime":"2025-10-09T19:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.082375 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.082433 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.082448 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.082543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.082564 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.150980 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.150980 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.151131 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.151194 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.186022 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.186067 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.186078 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.186098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.186108 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.289523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.289615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.289643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.289683 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.289713 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.393151 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.393228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.393247 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.393277 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.393297 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.495646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.495693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.495705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.495725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.495737 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.598552 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.598616 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.598632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.598658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.598670 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.658602 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.658665 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.658684 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.658706 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.658720 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.679240 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.684846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.684883 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.684897 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.684916 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.684933 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.699088 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.703506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.703586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.703614 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.703654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.703684 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.719431 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.726742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.726858 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.726875 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.726896 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.726917 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.743226 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.747681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.747724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.747734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.747751 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.747762 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.762934 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:02Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:02 crc kubenswrapper[4907]: E1009 19:30:02.763094 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.764923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.764966 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.764980 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.764999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.765010 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.867239 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.867273 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.867283 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.867296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.867305 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.969429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.969484 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.969497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.969516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:02 crc kubenswrapper[4907]: I1009 19:30:02.969526 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:02Z","lastTransitionTime":"2025-10-09T19:30:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.071913 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.071948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.071957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.071971 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.071982 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.150913 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.150984 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:03 crc kubenswrapper[4907]: E1009 19:30:03.151048 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:03 crc kubenswrapper[4907]: E1009 19:30:03.151104 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.175394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.175447 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.175460 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.175508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.175519 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.277988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.278025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.278034 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.278048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.278058 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.380048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.380086 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.380094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.380109 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.380119 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.483303 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.483376 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.483394 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.483419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.483436 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.586941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.587017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.587032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.587053 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.587065 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.689868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.689900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.689908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.689922 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.689932 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.793539 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.793621 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.793641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.793670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.793693 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.897272 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.897318 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.897328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.897349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:03 crc kubenswrapper[4907]: I1009 19:30:03.897359 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:03Z","lastTransitionTime":"2025-10-09T19:30:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.000307 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.000356 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.000366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.000384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.000394 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.103231 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.103284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.103293 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.103306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.103315 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.151194 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.151239 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:04 crc kubenswrapper[4907]: E1009 19:30:04.151349 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:04 crc kubenswrapper[4907]: E1009 19:30:04.151517 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.205111 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.205151 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.205160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.205175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.205187 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.307413 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.307454 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.307479 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.307493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.307504 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.409185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.409216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.409226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.409241 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.409252 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.512037 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.512085 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.512098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.512115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.512128 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.615230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.615302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.615315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.615341 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.615354 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.717418 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.717499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.717517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.717537 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.717551 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.825070 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.825121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.825131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.825146 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.825157 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.928739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.929025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.929089 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.929175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:04 crc kubenswrapper[4907]: I1009 19:30:04.929244 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:04Z","lastTransitionTime":"2025-10-09T19:30:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.032607 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.032693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.032713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.032744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.032766 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.136440 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.136509 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.136520 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.136540 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.136551 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.151306 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:05 crc kubenswrapper[4907]: E1009 19:30:05.151502 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.151589 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:05 crc kubenswrapper[4907]: E1009 19:30:05.151919 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.167266 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.190868 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.204916 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.218080 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.231489 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.238997 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.239138 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.239163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.239194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.239214 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.247200 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.261620 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0c8130-447e-4836-9358-17db808233d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff791c39ddb973fb41489e48e41803fadf855cf25423f47501c62fbe002cac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ff05209d16fb966d03bb41c5943bc7cff7a444bde6e7f126f9ff1d6479854a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3ff05209d16fb966d03bb41c5943bc7cff7a444bde6e7f126f9ff1d6479854a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.292159 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60356
09488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.313914 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197baa0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.336506 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.342819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.342877 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.342898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.342927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.342946 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.353936 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.370203 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.388892 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] 
Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.402404 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.424798 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc 
kubenswrapper[4907]: I1009 19:30:05.445753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.445821 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.445834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.445851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.445863 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.455625 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.481713 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:53Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:53.049522 6913 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 19:29:53.049577 6913 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:53.049619 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 19:29:53.049633 6913 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI1009 19:29:53.049661 6913 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.150464ms\\\\nI1009 19:29:53.049680 6913 services_controller.go:356] Processing sync for service openshift-dns/dns-default for network=default\\\\nF1009 19:29:53.049749 6913 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.496824 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:30:05Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.549214 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.549260 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.549305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.549323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.549333 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.652108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.652157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.652167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.652182 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.652190 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.754297 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.754604 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.755019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.755104 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.755137 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.857722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.857778 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.857789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.857808 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.857820 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.960055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.960112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.960125 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.960144 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:05 crc kubenswrapper[4907]: I1009 19:30:05.960158 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:05Z","lastTransitionTime":"2025-10-09T19:30:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.063547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.063615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.063636 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.063663 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.063687 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.150697 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:06 crc kubenswrapper[4907]: E1009 19:30:06.150865 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.150976 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:06 crc kubenswrapper[4907]: E1009 19:30:06.151278 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.166376 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.166408 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.166421 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.166439 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.166454 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.172689 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.269161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.269195 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.269204 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.269217 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.269228 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.371761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.371807 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.371820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.371840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.371853 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.476367 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.476457 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.476523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.476557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.476581 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.579948 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.580004 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.580021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.580047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.580067 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.682724 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.682767 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.682776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.682791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.682803 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.785683 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.785749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.785762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.785777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.785787 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.888820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.888903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.888918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.888937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.888949 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.992098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.992182 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.992200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.992219 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:06 crc kubenswrapper[4907]: I1009 19:30:06.992232 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:06Z","lastTransitionTime":"2025-10-09T19:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.095900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.095974 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.095989 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.096012 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.096026 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.151237 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:07 crc kubenswrapper[4907]: E1009 19:30:07.151688 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.151940 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:07 crc kubenswrapper[4907]: E1009 19:30:07.152076 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.200115 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.200194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.200215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.200244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.200263 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.303398 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.303524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.303551 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.303585 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.303609 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.405801 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.405854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.405869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.405889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.405905 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.508241 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.508291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.508304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.508321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.508333 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.611754 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.611864 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.611893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.611923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.611944 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.715350 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.715388 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.715399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.715413 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.715442 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.817762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.817823 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.817841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.817865 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.817885 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.920427 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.920692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.920763 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.920836 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:07 crc kubenswrapper[4907]: I1009 19:30:07.920905 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:07Z","lastTransitionTime":"2025-10-09T19:30:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.023002 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.023056 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.023072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.023094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.023112 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.126040 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.126088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.126102 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.126121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.126133 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.151172 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.151174 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:08 crc kubenswrapper[4907]: E1009 19:30:08.151550 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:08 crc kubenswrapper[4907]: E1009 19:30:08.151966 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.229632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.229683 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.229694 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.229710 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.229721 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.332313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.332342 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.332351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.332364 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.332372 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.435618 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.435687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.435712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.435741 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.435765 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.538917 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.538984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.539010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.539031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.539046 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.642742 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.642801 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.642816 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.642840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.642853 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.745978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.746018 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.746042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.746063 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.746076 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.848999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.849041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.849057 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.849076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.849089 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.951131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.951179 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.951203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.951232 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:08 crc kubenswrapper[4907]: I1009 19:30:08.951247 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:08Z","lastTransitionTime":"2025-10-09T19:30:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.053749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.053789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.053800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.053817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.053829 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.151257 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.151331 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:09 crc kubenswrapper[4907]: E1009 19:30:09.151394 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:09 crc kubenswrapper[4907]: E1009 19:30:09.151803 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.155595 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.155631 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.155641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.155654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.155663 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.258377 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.258412 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.258420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.258435 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.258445 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.360815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.360847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.360856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.360869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.360879 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.463863 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.463900 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.463911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.463928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.463939 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.566974 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.567088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.567170 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.567255 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.567288 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.669963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.670011 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.670020 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.670037 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.670047 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.772025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.772094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.772124 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.772161 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.772180 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.874569 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.874630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.874646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.874664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.874679 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.977110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.977141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.977149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.977163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:09 crc kubenswrapper[4907]: I1009 19:30:09.977172 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:09Z","lastTransitionTime":"2025-10-09T19:30:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.079374 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.079403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.079411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.079425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.079434 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.150791 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.150815 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:10 crc kubenswrapper[4907]: E1009 19:30:10.151413 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:10 crc kubenswrapper[4907]: E1009 19:30:10.151515 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.151979 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a" Oct 09 19:30:10 crc kubenswrapper[4907]: E1009 19:30:10.152266 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.183248 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.183282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.183292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.183306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.183314 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.285775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.285877 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.285894 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.285933 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.285954 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.392272 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.392309 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.392321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.392337 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.392349 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.495420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.495452 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.495460 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.495497 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.495505 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.597392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.597425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.597434 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.597448 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.597457 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.700596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.700644 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.700654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.700675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.700686 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.803309 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.803346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.803355 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.803369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.803378 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.905765 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.905808 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.905817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.905831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:10 crc kubenswrapper[4907]: I1009 19:30:10.905842 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:10Z","lastTransitionTime":"2025-10-09T19:30:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.007911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.007943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.007952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.007988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.007998 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.110662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.110740 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.110764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.110792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.110814 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.151280 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.151321 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:11 crc kubenswrapper[4907]: E1009 19:30:11.151427 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:11 crc kubenswrapper[4907]: E1009 19:30:11.151541 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.213640 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.213689 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.213703 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.213718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.213730 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.317090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.317132 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.317141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.317158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.317171 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.420657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.420790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.420814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.420851 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.420874 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.525373 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.525437 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.525461 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.525525 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.525547 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.629361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.629426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.629449 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.629506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.629528 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.733042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.733136 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.733163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.733214 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.733243 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.836609 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.836668 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.836686 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.836713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.836731 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.940233 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.940285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.940306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.940331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:11 crc kubenswrapper[4907]: I1009 19:30:11.940352 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:11Z","lastTransitionTime":"2025-10-09T19:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.043685 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.043753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.043777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.043813 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.043837 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.147890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.147953 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.147970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.147996 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.148016 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.151315 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.151548 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.151812 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.152037 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.252149 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.252229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.252254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.252289 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.252317 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.356635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.356710 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.356729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.356758 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.356776 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.460760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.460843 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.460861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.460889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.460909 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.564450 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.564605 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.564629 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.564664 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.564686 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.669278 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.669384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.669404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.669954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.669992 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.773524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.773583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.773596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.773616 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.773630 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.868273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.868599 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.868721 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs podName:06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:16.86869162 +0000 UTC m=+162.400659139 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs") pod "network-metrics-daemon-sbjsv" (UID: "06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.877677 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.877753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.877776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.877870 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.877897 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.892023 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.892074 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.892087 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.892106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.892119 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.911378 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:12Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.917262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.917298 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.917309 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.917328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.917343 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.931914 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:12Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.936674 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.936718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.936732 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.936749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.936761 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.953919 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:12Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.959112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.959168 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.959189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.959213 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.959232 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.977849 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:12Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.981878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.981930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.981950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.981977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.981995 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.996324 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:12Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:12 crc kubenswrapper[4907]: E1009 19:30:12.996517 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.997941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.997969 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.997982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.998001 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:12 crc kubenswrapper[4907]: I1009 19:30:12.998014 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:12Z","lastTransitionTime":"2025-10-09T19:30:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.100878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.100928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.100942 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.100964 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.100978 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.150972 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.151076 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:13 crc kubenswrapper[4907]: E1009 19:30:13.151232 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:13 crc kubenswrapper[4907]: E1009 19:30:13.151400 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.204198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.204270 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.204296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.204330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.204354 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.307430 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.307573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.307606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.307637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.307662 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.411566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.411636 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.411649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.411677 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.411692 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.515107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.515175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.515194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.515222 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.515242 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.618730 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.618812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.618833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.618864 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.618885 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.722837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.722914 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.722940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.722973 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.723001 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.826324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.826404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.826422 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.826454 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.826505 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.930257 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.930331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.930354 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.930387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:13 crc kubenswrapper[4907]: I1009 19:30:13.930409 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:13Z","lastTransitionTime":"2025-10-09T19:30:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.034433 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.034539 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.034560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.034591 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.034614 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.137903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.137976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.137991 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.138015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.138030 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.150849 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.150921 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:14 crc kubenswrapper[4907]: E1009 19:30:14.151329 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:14 crc kubenswrapper[4907]: E1009 19:30:14.151627 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.241246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.241321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.241347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.241382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.241409 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.346353 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.346426 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.346447 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.346510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.346533 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.450193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.450259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.450275 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.450302 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.450320 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.553534 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.553596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.553606 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.553620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.553650 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.657230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.657321 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.657349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.657374 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.657394 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.761188 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.761245 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.761256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.761277 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.761289 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.864675 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.864777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.864801 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.864834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.864855 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.969035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.969106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.969127 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.969157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:14 crc kubenswrapper[4907]: I1009 19:30:14.969176 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:14Z","lastTransitionTime":"2025-10-09T19:30:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.072548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.072618 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.072630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.072651 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.072666 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.151405 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.151447 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:15 crc kubenswrapper[4907]: E1009 19:30:15.151671 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:15 crc kubenswrapper[4907]: E1009 19:30:15.151934 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.175623 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"232fe335-3cd6-4fb1-b335-07fbfe64c940\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5387c1738a4d05c6109a742f6b61676cfbeb715df755904f74ae1e34a4aab5ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2
808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f2c23f4e7bf5151120c44939afeb5acd6953a34a22a927c3a26233339663616\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f818a76830d877a1cb53b585c4446f17dae824d6d1be68d8fb7776f1b26584\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17e5b28c03e5a72631b0861b7bcdca9fad9804c1736890f3b3e9cac83e0d1b03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6035609488fc4f082c856e24f81662a8fbc84690972fb8029d6bf1cbdb41d11d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d282bd9d74b834be74be4e26091853c2bb7bcc920d445713ba1b3a077abf9238\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291c41867b1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://291c41867b
1e62776d78e78fe107a461b0b265209332ed100f93a8071b01f318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:29:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zrt2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8tzv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.177173 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.177292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.177360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.177430 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.177515 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.192919 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41b5cc90-e796-4d1c-b9e8-0d68c2a19e0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b9585fad3278d11e98cf7955b01754436e8d3e001b4ba90ee1777b511ea8fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32b96b0ec5eb7c8177d2e5ae80bb99c21e60a444fb41bd27ba51c024cc9b9fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65218b377f32e0ad7ab82bd14de0b58e5fbb3cc02fea899429481d81ee4418e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d8164793ba9fba55e5c3cec10adbb8aad85d01f676d3ac5e8fe8584b0654284b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.213740 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc09dee2359e738a46023d6f6ba3b74ec9493cb7ed4749b726dc46c3718a1caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9e9780c265038f1baacfd23d4f426bd3f27335178136c5709a68b49f9e78a99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.230082 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.245497 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c98e5d7e-5d91-4825-a839-86a88cc66d4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1711626e1cafce5ff64e11d7e6f1f2007a596390d8f4708fb2a3cceda3f31eac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04160b25a296b42998046f3533cfb2b1197ba
a0de26895e3186c0dddf9769dff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g6cpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ck44r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.262395 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03f2ee5b-88c3-4926-9659-94e1924be69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa9d1c8dd775cc08a2167606d4eee06b015c6c6223f6266dacf8d87999814463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2092ce68d0b062da52b514be4d958a3fbf650f77b93b1a246d1f832018449ebc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7e50bf4fa1d0a9aafac14459741ac669af001988334a482563f0492c382e5cd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab23d2c5229134f0fda022b5625af650591a7c18dfefcbb85e27250c58483335\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63aadeed5c845a481cdc6c8c96072838b842f2853231e0d9dabf695870e2c078\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T19:28:54Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1009 19:28:48.865829 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 19:28:48.867899 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2056989089/tls.crt::/tmp/serving-cert-2056989089/tls.key\\\\\\\"\\\\nI1009 19:28:54.493118 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 19:28:54.497390 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 19:28:54.497429 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 19:28:54.497493 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 19:28:54.497510 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 19:28:54.516358 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1009 19:28:54.516372 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 19:28:54.516399 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516433 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 19:28:54.516439 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 19:28:54.516442 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 19:28:54.516445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 19:28:54.516450 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 19:28:54.518774 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://938d6c4f947d7b81e2918800f46f1be66e126b4c4e1bfc0b67c7c2872231abad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c9134253503880d2460024aff487477f15e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ab4e53de8f0c421d1fad10ee80c6c91
34253503880d2460024aff487477f15e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.279167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.279209 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.279223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.279242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.279255 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.286873 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:53Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns-operator/metrics]} name:Service_openshift-dns-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.174:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d937b3b3-82c3-4791-9a66-41b9fed53e9d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 19:29:53.049522 6913 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 19:29:53.049577 6913 ovnkube.go:599] Stopped ovnkube\\\\nI1009 19:29:53.049619 6913 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 19:29:53.049633 6913 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/community-operators\\\\\\\"}\\\\nI1009 19:29:53.049661 6913 services_controller.go:360] Finished syncing service community-operators on namespace openshift-marketplace for network=default : 2.150464ms\\\\nI1009 19:29:53.049680 6913 services_controller.go:356] Processing sync for service openshift-dns/dns-default for network=default\\\\nF1009 19:29:53.049749 6913 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:29:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978601d56b285b76dc
57f440a0b819d1994393f870cdf24da327f268132eb2c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8n28\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t8m7t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.303259 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b17d457c83a26ea9593160f21294291f16819fd6cab8855f1b7a28610a7984cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.319697 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hns2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64344fcc-f9f2-424f-a32b-44927641b614\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T19:29:43Z\\\",\\\"message\\\":\\\"2025-10-09T19:28:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea\\\\n2025-10-09T19:28:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_abde2d44-2ad3-440c-86f2-65bdeb3b4cea to /host/opt/cni/bin/\\\\n2025-10-09T19:28:58Z [verbose] multus-daemon started\\\\n2025-10-09T19:28:58Z [verbose] Readiness Indicator file check\\\\n2025-10-09T19:29:43Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kxdh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hns2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.335664 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"717141fe-c68d-4844-ad99-872d296a6370\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0c944480a086e28fbda42ee23fa21004964767c98316dbd28295825fbf799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://796b4498348e78e11c8dda4ae58c397dee04d603
35891243436efe172e5e0b61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5vd7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v2wbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.350208 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxw2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sbjsv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc 
kubenswrapper[4907]: I1009 19:30:15.367566 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48d8e1e9-8bc3-4968-91f9-ca02660947f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3d166a1c942918a0a3e3f9a915e8edda9b35f55084835ebe61e2f4e5b5177e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://682bbbcfbfae96fd576e60b96a750932b63874516ff8bd90cae525069cdde90b\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26046b0f66604b6207aa57788b545bf512101d1814a1e4ddd5250018b2433bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8c17a8806f8411b0fc4ad07ed4e0f692d4da0585621b5c926257aca03987c3e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.382565 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.382623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.382632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.382651 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.382663 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.384916 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed0c8130-447e-4836-9358-17db808233d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ff791c39ddb973fb41489e48e41803fadf855cf25423f47501c62fbe002cac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var
/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3ff05209d16fb966d03bb41c5943bc7cff7a444bde6e7f126f9ff1d6479854a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3ff05209d16fb966d03bb41c5943bc7cff7a444bde6e7f126f9ff1d6479854a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.413500 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6520e1c0-d7a0-4e20-8e43-8cf8cf45b43c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://765eeb261d0e387fb63824327d530f70bd8c6625791c5f8f4572a9ac2f1b2ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dd7324a85578c127decf2b1c7f641552e50086fc9fa8078ced3d98c5ca7af5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7d4591279766a8da5539891803a3abd39fc8ab0522ff21c570a0d61513b0f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29922ea66d77f8e78a4c5fe940c41f7013ce3ca20128e0ec2967a3b7869c2889\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be43eb572c8b1c69ebf93339bf933ac1fca5d434356f409da42b1439ac566162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e486e0bd3789894e356b8835028d4ecd1bf0848156531f685a241f092b5cd93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e486e0bd3789894e356b8835028d4ecd1bf0848156531f685a241f092b5cd93\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T19:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54824376d71d50fdc730df981b13b52c689691125f1635d73fae9edb9ca59591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54824376d71d50fdc730df981b13b52c689691125f1635d73fae9edb9ca59591\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:37Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://39852f841ed98f045e3215238f101af52c7ef784c0976684c006f033d378a969\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39852f841ed98f045e3215238f101af52c7ef784c0976684c006f033d378a969\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T19:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T19:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:35Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.432965 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.452809 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cd5bb2968c17332346b54d161a302a408a70a2cb57c5e19f80278d050965fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.471324 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.484760 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.484873 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.484895 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.484924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.484950 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.485068 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2n5kb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bed29d-cec4-4051-98da-e4a5547f1827\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:28:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64359e7e39d988df5511622e805e75e2c6a9431f0ac66658b2bd74824f46b3f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:28:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hg9zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:28:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2n5kb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.500968 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dslfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eaeec14-bcbe-4871-b6c2-7ebd234c04bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T19:29:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba802efff12e56897628776547ad0ea961222bc6e4b7156ff383aafb049ab8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T19:29:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5c2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T19:29:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dslfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:15Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.587403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.587511 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.587530 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.587560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.587579 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.691272 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.691363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.691773 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.691826 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.691850 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.794725 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.795263 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.795518 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.795729 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.795913 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.900287 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.900365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.900381 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.900403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:15 crc kubenswrapper[4907]: I1009 19:30:15.900419 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:15Z","lastTransitionTime":"2025-10-09T19:30:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.003931 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.003976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.003985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.004004 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.004014 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.107403 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.107457 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.107507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.107529 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.107543 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.151151 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.151266 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:16 crc kubenswrapper[4907]: E1009 19:30:16.152054 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:16 crc kubenswrapper[4907]: E1009 19:30:16.152340 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.210037 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.210120 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.210139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.210170 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.210193 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.313711 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.313756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.313769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.313791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.313803 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.417046 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.417124 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.417142 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.417169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.417191 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.520869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.520961 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.520990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.521026 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.521053 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.624709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.624797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.624812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.624842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.624868 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.727615 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.727709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.727733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.727770 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.727790 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.832992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.833072 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.833090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.833146 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.833167 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.936387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.936535 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.936555 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.936586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:16 crc kubenswrapper[4907]: I1009 19:30:16.936605 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:16Z","lastTransitionTime":"2025-10-09T19:30:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.040101 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.040188 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.040210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.040239 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.040259 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.143876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.143994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.144019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.144050 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.144072 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.150902 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.151070 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:17 crc kubenswrapper[4907]: E1009 19:30:17.151235 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:17 crc kubenswrapper[4907]: E1009 19:30:17.151348 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.247284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.247364 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.247389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.247420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.247443 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.351363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.351436 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.351450 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.351508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.351528 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.456210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.456299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.456330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.456363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.456387 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.559775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.559856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.559883 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.559916 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.559942 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.663817 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.663906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.663935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.663967 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.663987 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.768099 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.768176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.768194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.768228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.768250 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.872230 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.872308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.872328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.872359 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.872396 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.976410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.976537 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.976624 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.976655 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:17 crc kubenswrapper[4907]: I1009 19:30:17.976686 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:17Z","lastTransitionTime":"2025-10-09T19:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.081324 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.081439 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.081540 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.081575 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.081600 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.151622 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:18 crc kubenswrapper[4907]: E1009 19:30:18.151908 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.152165 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:18 crc kubenswrapper[4907]: E1009 19:30:18.152379 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.185528 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.185605 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.185632 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.185661 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.185684 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.288882 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.288985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.289008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.289035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.289055 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.392800 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.392872 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.392891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.392919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.392940 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.496368 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.496424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.496435 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.496453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.496502 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.600890 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.600964 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.600984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.601017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.601038 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.704839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.704916 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.704934 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.704963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.704982 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.808874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.808949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.808970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.809000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.809020 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.912524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.912620 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.912640 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.912670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:18 crc kubenswrapper[4907]: I1009 19:30:18.912690 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:18Z","lastTransitionTime":"2025-10-09T19:30:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.016386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.016510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.016538 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.016571 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.016597 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.120307 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.120444 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.120533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.120575 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.120600 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.151184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.151184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:19 crc kubenswrapper[4907]: E1009 19:30:19.151955 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:19 crc kubenswrapper[4907]: E1009 19:30:19.152198 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.224202 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.224843 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.225027 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.225207 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.225395 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.330040 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.330594 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.331015 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.331189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.331340 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.434739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.435253 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.435462 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.435762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.435960 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.538904 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.538983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.539004 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.539032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.539053 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.643197 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.643251 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.643264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.643284 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.643299 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.748929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.749005 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.749024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.749052 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.749071 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.853400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.853513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.853533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.853565 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.853585 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.957393 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.957509 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.957531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.957560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:19 crc kubenswrapper[4907]: I1009 19:30:19.957578 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:19Z","lastTransitionTime":"2025-10-09T19:30:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.061519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.061622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.061631 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.061648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.061689 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.150620 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.150669 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:20 crc kubenswrapper[4907]: E1009 19:30:20.150938 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:20 crc kubenswrapper[4907]: E1009 19:30:20.151161 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.165164 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.165226 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.165242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.165325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.165346 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.268313 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.268395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.268420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.268457 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.268535 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.372903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.372982 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.373002 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.373031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.373052 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.477388 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.477499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.477525 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.477555 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.477579 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.582021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.582110 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.582135 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.582170 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.582189 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.686281 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.686834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.687069 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.687423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.687707 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.790579 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.790699 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.790718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.790748 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.790773 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.893998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.894078 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.894103 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.894137 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.894166 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.997304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.997386 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.997404 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.997432 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:20 crc kubenswrapper[4907]: I1009 19:30:20.997453 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:20Z","lastTransitionTime":"2025-10-09T19:30:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.101141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.102316 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.102630 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.102839 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.103021 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.150859 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.150863 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:21 crc kubenswrapper[4907]: E1009 19:30:21.151164 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:21 crc kubenswrapper[4907]: E1009 19:30:21.151261 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.206435 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.206516 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.206533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.206555 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.206570 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.310112 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.310212 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.310239 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.310276 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.310312 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.413984 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.414051 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.414073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.414101 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.414123 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.517522 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.517604 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.517623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.517654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.517676 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.620034 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.620090 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.620103 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.620119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.620130 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.729599 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.729693 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.729719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.729755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.729782 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.833785 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.833835 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.833848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.833868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.833883 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.937416 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.937553 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.937569 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.937589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:21 crc kubenswrapper[4907]: I1009 19:30:21.937611 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:21Z","lastTransitionTime":"2025-10-09T19:30:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.041048 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.041102 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.041114 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.041133 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.041144 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.144591 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.144682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.144713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.144747 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.144775 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.151497 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.151551 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:22 crc kubenswrapper[4907]: E1009 19:30:22.151809 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:22 crc kubenswrapper[4907]: E1009 19:30:22.151951 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.247860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.247938 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.247957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.247985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.248003 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.351898 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.351986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.352010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.352042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.352063 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.456318 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.456402 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.456419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.456446 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.456514 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.560028 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.560131 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.560166 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.560204 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.560232 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.664582 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.664646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.664657 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.664677 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.664691 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.767870 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.767956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.767987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.768033 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.768061 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.872946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.873323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.873400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.873508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.873602 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.976220 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.976295 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.976325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.976361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:22 crc kubenswrapper[4907]: I1009 19:30:22.976387 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:22Z","lastTransitionTime":"2025-10-09T19:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.075351 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.075410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.075425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.075446 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.075460 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.097345 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.103395 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.103440 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.103455 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.103502 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.103518 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.123195 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.128526 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.128589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.128609 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.128637 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.128657 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.151200 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.151213 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.151390 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.152356 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.151649 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.157716 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.157766 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.157785 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.157811 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.157833 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.179217 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.184649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.184744 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.184769 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.184799 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.184823 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.208884 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T19:30:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"18e2d302-c2fb-4ade-9fd1-bc58926be156\\\",\\\"systemUUID\\\":\\\"de5ae157-82cf-491d-b46e-a75d3a70699d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T19:30:23Z is after 2025-08-24T17:21:41Z" Oct 09 19:30:23 crc kubenswrapper[4907]: E1009 19:30:23.209108 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.211409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.211498 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.211521 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.211548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.211567 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.315601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.315697 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.315716 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.315743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.315764 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.419310 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.419391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.419410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.419943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.420014 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.523106 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.523211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.523235 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.523270 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.523295 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.627749 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.627855 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.627878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.627913 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.627935 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.731646 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.731722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.731747 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.731780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.731805 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.834585 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.834696 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.834710 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.834728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.834744 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.938003 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.938054 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.938073 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.938100 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:23 crc kubenswrapper[4907]: I1009 19:30:23.938123 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:23Z","lastTransitionTime":"2025-10-09T19:30:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.041064 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.041139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.041163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.041193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.041214 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.144804 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.144859 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.144868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.144891 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.144901 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.151291 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.151291 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:24 crc kubenswrapper[4907]: E1009 19:30:24.151825 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:24 crc kubenswrapper[4907]: E1009 19:30:24.151865 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.247833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.247905 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.247930 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.247954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.247970 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.351290 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.351367 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.351385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.351415 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.351433 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.455847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.455937 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.455957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.455992 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.456013 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.559971 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.560044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.560065 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.560094 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.560116 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.663245 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.663327 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.663349 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.663379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.663403 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.766651 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.766734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.766754 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.766784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.766811 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.871178 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.871264 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.871289 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.871323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.871348 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.974954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.975024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.975041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.975061 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:24 crc kubenswrapper[4907]: I1009 19:30:24.975074 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:24Z","lastTransitionTime":"2025-10-09T19:30:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.078715 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.078806 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.078838 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.078874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.078903 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.151220 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.151780 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:25 crc kubenswrapper[4907]: E1009 19:30:25.151767 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:25 crc kubenswrapper[4907]: E1009 19:30:25.152960 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.153529 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a" Oct 09 19:30:25 crc kubenswrapper[4907]: E1009 19:30:25.153862 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t8m7t_openshift-ovn-kubernetes(85e063f4-3eb6-4502-bf2a-b7e8b0dd7631)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.182824 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.182881 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.182895 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.182918 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 
19:30:25.182939 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.214125 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.214088994 podStartE2EDuration="1m30.214088994s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.213748886 +0000 UTC m=+110.745716415" watchObservedRunningTime="2025-10-09 19:30:25.214088994 +0000 UTC m=+110.746056523" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.284899 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.284946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.284959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.284978 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.284994 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.304375 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hns2h" podStartSLOduration=91.30435219 podStartE2EDuration="1m31.30435219s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.303109929 +0000 UTC m=+110.835077448" watchObservedRunningTime="2025-10-09 19:30:25.30435219 +0000 UTC m=+110.836319689" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.336457 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podStartSLOduration=91.336434928 podStartE2EDuration="1m31.336434928s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.317550948 +0000 UTC m=+110.849518527" watchObservedRunningTime="2025-10-09 19:30:25.336434928 +0000 UTC m=+110.868402427" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.372587 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2n5kb" podStartSLOduration=91.372351231 podStartE2EDuration="1m31.372351231s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.354328603 
+0000 UTC m=+110.886296122" watchObservedRunningTime="2025-10-09 19:30:25.372351231 +0000 UTC m=+110.904318730" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.383043 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dslfr" podStartSLOduration=90.382980486 podStartE2EDuration="1m30.382980486s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.370946456 +0000 UTC m=+110.902913955" watchObservedRunningTime="2025-10-09 19:30:25.382980486 +0000 UTC m=+110.914948005" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.388116 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.388169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.388182 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.388203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.388217 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.399181 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=55.399165178 podStartE2EDuration="55.399165178s" podCreationTimestamp="2025-10-09 19:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.398975884 +0000 UTC m=+110.930943383" watchObservedRunningTime="2025-10-09 19:30:25.399165178 +0000 UTC m=+110.931132687" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.450875 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.450848204 podStartE2EDuration="26.450848204s" podCreationTimestamp="2025-10-09 19:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.413914385 +0000 UTC m=+110.945881894" watchObservedRunningTime="2025-10-09 19:30:25.450848204 +0000 UTC m=+110.982815723" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.451326 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=19.451318206 podStartE2EDuration="19.451318206s" podCreationTimestamp="2025-10-09 19:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.450699201 +0000 UTC m=+110.982666720" watchObservedRunningTime="2025-10-09 19:30:25.451318206 +0000 UTC m=+110.983285725" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.491389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 
19:30:25.491684 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.491756 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.491841 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.491910 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.535109 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=86.53507676 podStartE2EDuration="1m26.53507676s" podCreationTimestamp="2025-10-09 19:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.535065619 +0000 UTC m=+111.067033118" watchObservedRunningTime="2025-10-09 19:30:25.53507676 +0000 UTC m=+111.067044259" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.536046 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z8tzv" podStartSLOduration=91.536039874 podStartE2EDuration="1m31.536039874s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 19:30:25.515549884 +0000 UTC m=+111.047517403" watchObservedRunningTime="2025-10-09 19:30:25.536039874 +0000 UTC m=+111.068007373" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.585910 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ck44r" podStartSLOduration=90.585891154 podStartE2EDuration="1m30.585891154s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:25.58533732 +0000 UTC m=+111.117304839" watchObservedRunningTime="2025-10-09 19:30:25.585891154 +0000 UTC m=+111.117858643" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.596819 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.596855 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.596866 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.596884 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.596898 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.699220 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.699256 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.699268 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.699286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.699297 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.801723 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.801763 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.801771 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.801785 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.801794 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.903913 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.903959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.903970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.903987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:25 crc kubenswrapper[4907]: I1009 19:30:25.903997 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:25Z","lastTransitionTime":"2025-10-09T19:30:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.007154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.007211 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.007224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.007244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.007256 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.110928 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.110985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.111004 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.111030 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.111048 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.150846 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.150938 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:26 crc kubenswrapper[4907]: E1009 19:30:26.151164 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:26 crc kubenswrapper[4907]: E1009 19:30:26.151339 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.214949 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.215006 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.215016 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.215038 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.215050 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.319799 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.319879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.319903 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.319939 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.319987 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.423113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.423738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.424071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.424400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.424734 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.528721 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.529191 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.529411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.529644 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.529853 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.633365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.633508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.633541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.633570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.633590 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.736541 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.736609 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.736624 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.736648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.736665 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.841392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.841522 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.841549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.841586 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.841613 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.945314 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.945410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.945431 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.945493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:26 crc kubenswrapper[4907]: I1009 19:30:26.945519 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:26Z","lastTransitionTime":"2025-10-09T19:30:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.048385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.048428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.048442 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.048488 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.048505 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.150605 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:27 crc kubenswrapper[4907]: E1009 19:30:27.150796 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.150858 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:27 crc kubenswrapper[4907]: E1009 19:30:27.151105 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.153348 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.153627 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.153792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.153954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.154099 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.257633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.258042 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.258233 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.258424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.258650 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.363362 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.363420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.363441 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.363502 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.363524 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.467893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.468160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.468192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.468228 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.468253 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.572950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.573527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.573811 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.574014 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.574173 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.677391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.677452 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.677500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.677530 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.677544 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.780424 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.780532 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.780557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.780589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.780612 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.884060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.884134 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.884155 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.884184 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.884206 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.987840 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.988311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.988387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.988482 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:27 crc kubenswrapper[4907]: I1009 19:30:27.988551 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:27Z","lastTransitionTime":"2025-10-09T19:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.091691 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.092139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.092246 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.092371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.092539 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.150908 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:28 crc kubenswrapper[4907]: E1009 19:30:28.151240 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.151325 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:28 crc kubenswrapper[4907]: E1009 19:30:28.151940 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.195719 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.195762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.195780 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.195815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.195829 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.298097 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.298137 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.298148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.298167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.298177 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.400224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.400259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.400270 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.400286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.400297 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.502791 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.502860 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.502879 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.502906 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.502929 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.605371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.605450 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.605500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.605528 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.605553 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.708846 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.709328 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.709589 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.709820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.710034 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.813371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.813503 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.813524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.813556 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.813579 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.917761 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.917835 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.917854 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.917881 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:28 crc kubenswrapper[4907]: I1009 19:30:28.917899 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:28Z","lastTransitionTime":"2025-10-09T19:30:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.021144 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.021210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.021231 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.021260 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.021283 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.125232 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.125320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.125339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.125366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.125383 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.151007 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.151013 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:29 crc kubenswrapper[4907]: E1009 19:30:29.151254 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:29 crc kubenswrapper[4907]: E1009 19:30:29.151411 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.229192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.229267 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.229286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.229315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.229334 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.333392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.333526 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.333560 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.333591 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.333618 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.436443 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.436533 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.436544 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.436570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.436581 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.540162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.540253 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.540277 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.540315 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.540344 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.643593 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.643643 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.643660 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.643682 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.643699 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.747305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.747391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.747411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.747461 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.747531 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.833171 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/1.log" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.833920 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/0.log" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.833975 4907 generic.go:334] "Generic (PLEG): container finished" podID="64344fcc-f9f2-424f-a32b-44927641b614" containerID="40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d" exitCode=1 Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.834048 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerDied","Data":"40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.834197 4907 scope.go:117] "RemoveContainer" containerID="4d3832bd43c04a763c08fc8cf6f2032dd9cb3ce64d0933e1d33fbf46ff9d4c22" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.834666 4907 scope.go:117] "RemoveContainer" containerID="40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d" Oct 09 19:30:29 crc kubenswrapper[4907]: E1009 19:30:29.834878 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-hns2h_openshift-multus(64344fcc-f9f2-424f-a32b-44927641b614)\"" pod="openshift-multus/multus-hns2h" podUID="64344fcc-f9f2-424f-a32b-44927641b614" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.850977 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc 
kubenswrapper[4907]: I1009 19:30:29.851519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.851547 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.851583 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.851619 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.956323 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.956508 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.956530 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.956596 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:29 crc kubenswrapper[4907]: I1009 19:30:29.956617 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:29Z","lastTransitionTime":"2025-10-09T19:30:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.059768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.059812 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.059821 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.059835 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.059844 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.150663 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.150871 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:30 crc kubenswrapper[4907]: E1009 19:30:30.150881 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:30 crc kubenswrapper[4907]: E1009 19:30:30.151120 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.163656 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.163727 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.163753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.163789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.163815 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.267566 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.267635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.267653 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.267680 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.267700 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.371282 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.371369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.371389 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.371419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.371442 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.475370 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.475428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.475439 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.475487 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.475500 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.579077 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.579141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.579159 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.579189 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.579209 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.682194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.682319 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.682339 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.682369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.682389 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.786141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.786239 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.786266 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.786294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.786316 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.839813 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/1.log" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.889790 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.889878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.889894 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.889919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.889934 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.993309 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.993392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.993410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.993436 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:30 crc kubenswrapper[4907]: I1009 19:30:30.993455 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:30Z","lastTransitionTime":"2025-10-09T19:30:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.097020 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.097076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.097087 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.097105 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.097118 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.151381 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.151412 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:31 crc kubenswrapper[4907]: E1009 19:30:31.151664 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:31 crc kubenswrapper[4907]: E1009 19:30:31.151806 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.201667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.201738 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.201752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.201776 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.201791 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.305378 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.305446 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.305513 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.305549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.305574 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.409028 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.409098 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.409118 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.409146 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.409166 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.512944 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.513013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.513032 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.513062 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.513084 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.617050 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.617140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.617157 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.617185 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.617207 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.720856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.720957 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.720987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.721025 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.721051 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.824952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.825019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.825035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.825060 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.825076 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.928093 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.928163 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.928181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.928210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:31 crc kubenswrapper[4907]: I1009 19:30:31.928232 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:31Z","lastTransitionTime":"2025-10-09T19:30:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.032322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.032449 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.032500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.032535 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.032556 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.136573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.136642 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.136662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.136692 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.136749 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.150943 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.151015 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:32 crc kubenswrapper[4907]: E1009 19:30:32.151184 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:32 crc kubenswrapper[4907]: E1009 19:30:32.151306 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.240641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.240713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.240733 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.240786 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.240873 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.346190 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.346257 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.346276 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.346300 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.346317 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.450612 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.450670 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.450687 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.450713 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.450728 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.553779 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.553825 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.553833 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.553848 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.553858 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.662341 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.662387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.662400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.662419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.662432 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.766130 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.766175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.766192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.766215 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.766247 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.869175 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.869259 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.869281 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.869317 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.869344 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.972594 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.972648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.972658 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.972678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:32 crc kubenswrapper[4907]: I1009 19:30:32.972690 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:32Z","lastTransitionTime":"2025-10-09T19:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.076148 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.076192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.076223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.076244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.076257 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:33Z","lastTransitionTime":"2025-10-09T19:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.151013 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.151096 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:33 crc kubenswrapper[4907]: E1009 19:30:33.151226 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:33 crc kubenswrapper[4907]: E1009 19:30:33.151382 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.178941 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.179010 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.179024 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.179045 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.179061 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:33Z","lastTransitionTime":"2025-10-09T19:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.281491 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.281545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.281558 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.281580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.281594 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:33Z","lastTransitionTime":"2025-10-09T19:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.385035 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.385111 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.385139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.385208 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.385238 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:33Z","lastTransitionTime":"2025-10-09T19:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.413379 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.413436 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.413454 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.413517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.413536 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T19:30:33Z","lastTransitionTime":"2025-10-09T19:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.480829 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r"] Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.481192 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.486725 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.487686 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.487785 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.488392 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.545458 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2de0dcc7-f170-4578-9d28-6c34d168a846-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.545557 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2de0dcc7-f170-4578-9d28-6c34d168a846-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.545598 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2de0dcc7-f170-4578-9d28-6c34d168a846-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.545621 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2de0dcc7-f170-4578-9d28-6c34d168a846-service-ca\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.545639 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de0dcc7-f170-4578-9d28-6c34d168a846-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.646797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2de0dcc7-f170-4578-9d28-6c34d168a846-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.646890 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2de0dcc7-f170-4578-9d28-6c34d168a846-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" 
Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.646938 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2de0dcc7-f170-4578-9d28-6c34d168a846-service-ca\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.646974 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de0dcc7-f170-4578-9d28-6c34d168a846-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.646977 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2de0dcc7-f170-4578-9d28-6c34d168a846-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.647051 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2de0dcc7-f170-4578-9d28-6c34d168a846-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.647132 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2de0dcc7-f170-4578-9d28-6c34d168a846-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.649363 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2de0dcc7-f170-4578-9d28-6c34d168a846-service-ca\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.656080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2de0dcc7-f170-4578-9d28-6c34d168a846-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.682277 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2de0dcc7-f170-4578-9d28-6c34d168a846-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-86l9r\" (UID: \"2de0dcc7-f170-4578-9d28-6c34d168a846\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.801209 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" Oct 09 19:30:33 crc kubenswrapper[4907]: I1009 19:30:33.856356 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" event={"ID":"2de0dcc7-f170-4578-9d28-6c34d168a846","Type":"ContainerStarted","Data":"25c01eb0b8e640d168cf84358e78adedd92cabaf128016595dc59d84cb91d70a"} Oct 09 19:30:34 crc kubenswrapper[4907]: I1009 19:30:34.151622 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:34 crc kubenswrapper[4907]: I1009 19:30:34.151656 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:34 crc kubenswrapper[4907]: E1009 19:30:34.151853 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:34 crc kubenswrapper[4907]: E1009 19:30:34.152001 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:34 crc kubenswrapper[4907]: I1009 19:30:34.861654 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" event={"ID":"2de0dcc7-f170-4578-9d28-6c34d168a846","Type":"ContainerStarted","Data":"3e3535f8717facc3ea3d20de008489f048b63ecbcecb13eb50fdf50bf4ff5037"} Oct 09 19:30:34 crc kubenswrapper[4907]: I1009 19:30:34.885266 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-86l9r" podStartSLOduration=100.885235311 podStartE2EDuration="1m40.885235311s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:34.884095683 +0000 UTC m=+120.416063212" watchObservedRunningTime="2025-10-09 19:30:34.885235311 +0000 UTC m=+120.417202830" Oct 09 19:30:35 crc kubenswrapper[4907]: E1009 19:30:35.142125 4907 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 09 19:30:35 crc kubenswrapper[4907]: I1009 19:30:35.150671 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:35 crc kubenswrapper[4907]: I1009 19:30:35.150671 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:35 crc kubenswrapper[4907]: E1009 19:30:35.152841 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:35 crc kubenswrapper[4907]: E1009 19:30:35.152977 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:35 crc kubenswrapper[4907]: E1009 19:30:35.268249 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:30:36 crc kubenswrapper[4907]: I1009 19:30:36.151005 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:36 crc kubenswrapper[4907]: I1009 19:30:36.151104 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:36 crc kubenswrapper[4907]: E1009 19:30:36.151247 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:36 crc kubenswrapper[4907]: E1009 19:30:36.151351 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:37 crc kubenswrapper[4907]: I1009 19:30:37.151585 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:37 crc kubenswrapper[4907]: E1009 19:30:37.151837 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:37 crc kubenswrapper[4907]: I1009 19:30:37.152216 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:37 crc kubenswrapper[4907]: E1009 19:30:37.152348 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:38 crc kubenswrapper[4907]: I1009 19:30:38.150861 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:38 crc kubenswrapper[4907]: E1009 19:30:38.151069 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:38 crc kubenswrapper[4907]: I1009 19:30:38.150878 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:38 crc kubenswrapper[4907]: E1009 19:30:38.151447 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:39 crc kubenswrapper[4907]: I1009 19:30:39.151440 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:39 crc kubenswrapper[4907]: I1009 19:30:39.151574 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:39 crc kubenswrapper[4907]: E1009 19:30:39.151785 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:39 crc kubenswrapper[4907]: E1009 19:30:39.151945 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:39 crc kubenswrapper[4907]: I1009 19:30:39.153057 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a" Oct 09 19:30:39 crc kubenswrapper[4907]: I1009 19:30:39.882299 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/3.log" Oct 09 19:30:39 crc kubenswrapper[4907]: I1009 19:30:39.885206 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerStarted","Data":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} Oct 09 19:30:39 crc kubenswrapper[4907]: I1009 19:30:39.885689 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:30:39 crc 
kubenswrapper[4907]: I1009 19:30:39.915372 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podStartSLOduration=105.915352159 podStartE2EDuration="1m45.915352159s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:39.914259882 +0000 UTC m=+125.446227381" watchObservedRunningTime="2025-10-09 19:30:39.915352159 +0000 UTC m=+125.447319668" Oct 09 19:30:40 crc kubenswrapper[4907]: I1009 19:30:40.139147 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sbjsv"] Oct 09 19:30:40 crc kubenswrapper[4907]: I1009 19:30:40.139312 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:40 crc kubenswrapper[4907]: E1009 19:30:40.139481 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:40 crc kubenswrapper[4907]: I1009 19:30:40.153850 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:40 crc kubenswrapper[4907]: E1009 19:30:40.154325 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:40 crc kubenswrapper[4907]: I1009 19:30:40.154531 4907 scope.go:117] "RemoveContainer" containerID="40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d" Oct 09 19:30:40 crc kubenswrapper[4907]: E1009 19:30:40.270117 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:30:40 crc kubenswrapper[4907]: I1009 19:30:40.891344 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/1.log" Oct 09 19:30:40 crc kubenswrapper[4907]: I1009 19:30:40.891988 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerStarted","Data":"9a676382e3b8fb157627fb4d13edff66f1e877f4c38457dd35387965f237f3df"} Oct 09 19:30:41 crc kubenswrapper[4907]: I1009 19:30:41.151496 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:41 crc kubenswrapper[4907]: I1009 19:30:41.151620 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:41 crc kubenswrapper[4907]: E1009 19:30:41.151746 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:41 crc kubenswrapper[4907]: E1009 19:30:41.151853 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:42 crc kubenswrapper[4907]: I1009 19:30:42.151441 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:42 crc kubenswrapper[4907]: I1009 19:30:42.151543 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:42 crc kubenswrapper[4907]: E1009 19:30:42.151806 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:42 crc kubenswrapper[4907]: E1009 19:30:42.151930 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:43 crc kubenswrapper[4907]: I1009 19:30:43.151090 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:43 crc kubenswrapper[4907]: I1009 19:30:43.151097 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:43 crc kubenswrapper[4907]: E1009 19:30:43.151344 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:43 crc kubenswrapper[4907]: E1009 19:30:43.151512 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:44 crc kubenswrapper[4907]: I1009 19:30:44.150622 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:44 crc kubenswrapper[4907]: I1009 19:30:44.150721 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:44 crc kubenswrapper[4907]: E1009 19:30:44.150773 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 19:30:44 crc kubenswrapper[4907]: E1009 19:30:44.150960 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sbjsv" podUID="06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b" Oct 09 19:30:45 crc kubenswrapper[4907]: I1009 19:30:45.150896 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:45 crc kubenswrapper[4907]: E1009 19:30:45.152238 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 19:30:45 crc kubenswrapper[4907]: I1009 19:30:45.152360 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:45 crc kubenswrapper[4907]: E1009 19:30:45.152740 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 19:30:46 crc kubenswrapper[4907]: I1009 19:30:46.150871 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:30:46 crc kubenswrapper[4907]: I1009 19:30:46.150870 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:30:46 crc kubenswrapper[4907]: I1009 19:30:46.154666 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 09 19:30:46 crc kubenswrapper[4907]: I1009 19:30:46.154958 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 09 19:30:46 crc kubenswrapper[4907]: I1009 19:30:46.154998 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 09 19:30:46 crc kubenswrapper[4907]: I1009 19:30:46.156309 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 09 19:30:47 crc kubenswrapper[4907]: I1009 19:30:47.151775 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:30:47 crc kubenswrapper[4907]: I1009 19:30:47.151813 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:30:47 crc kubenswrapper[4907]: I1009 19:30:47.155365 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 09 19:30:47 crc kubenswrapper[4907]: I1009 19:30:47.156123 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 09 19:30:51 crc kubenswrapper[4907]: I1009 19:30:51.775829 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.598672 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.650054 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.650621 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nxgdx"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.651010 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.651640 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.651923 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m47fb"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.652294 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.653889 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mzwdh"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.654229 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.655302 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wz5xt"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.656287 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.659567 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.659941 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: W1009 19:30:53.660332 4907 reflector.go:561] object-"openshift-ingress"/"router-certs-default": failed to list *v1.Secret: secrets "router-certs-default" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ingress": no relationship found between node 'crc' and this object Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.660400 4907 reflector.go:158] "Unhandled Error" err="object-\"openshift-ingress\"/\"router-certs-default\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"router-certs-default\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 19:30:53 crc kubenswrapper[4907]: W1009 19:30:53.660508 4907 reflector.go:561] object-"openshift-ingress"/"router-stats-default": failed to list *v1.Secret: secrets "router-stats-default" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ingress": no relationship found between node 'crc' and this object Oct 09 19:30:53 crc kubenswrapper[4907]: W1009 19:30:53.660706 4907 reflector.go:561] object-"openshift-ingress"/"router-metrics-certs-default": failed to list *v1.Secret: secrets "router-metrics-certs-default" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace 
"openshift-ingress": no relationship found between node 'crc' and this object Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.660747 4907 reflector.go:158] "Unhandled Error" err="object-\"openshift-ingress\"/\"router-metrics-certs-default\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"router-metrics-certs-default\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.660796 4907 reflector.go:158] "Unhandled Error" err="object-\"openshift-ingress\"/\"router-stats-default\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"router-stats-default\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.662509 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2fnwq"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.663596 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.664276 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.665118 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.665124 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.666121 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.667124 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qs6lp"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.668530 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hztkx"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.668790 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.672531 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.684830 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.685345 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 09 19:30:53 crc kubenswrapper[4907]: W1009 19:30:53.689380 4907 reflector.go:561] object-"openshift-ingress"/"service-ca-bundle": failed to list *v1.ConfigMap: configmaps "service-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ingress": no relationship found between node 'crc' and this object Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.689419 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.690650 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.692107 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.689424 4907 reflector.go:158] "Unhandled Error" err="object-\"openshift-ingress\"/\"service-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.694837 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.702798 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.703116 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.703167 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: W1009 19:30:53.703480 4907 reflector.go:561] object-"openshift-ingress"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ingress": no relationship found between node 'crc' and this object Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.703516 4907 reflector.go:158] "Unhandled Error" err="object-\"openshift-ingress\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.705875 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.706135 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.706142 4907 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.706683 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.707530 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.707808 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708033 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708311 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708444 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708629 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708825 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mdh4s"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709724 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708834 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708865 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708953 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.708972 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709024 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709055 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709084 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709080 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709112 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709128 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709225 4907 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709225 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709231 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.710624 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709300 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709370 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709441 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709532 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709537 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709560 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709589 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709595 4907 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709695 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709742 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.709821 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.711181 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.711546 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.711698 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.711830 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.711704 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.712263 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.712939 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc 
kubenswrapper[4907]: I1009 19:30:53.712990 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.713061 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.713147 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.713160 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.713269 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.713331 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.711054 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.718041 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.712452 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.723624 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.724329 4907 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.724970 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.725643 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.726284 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lqqp7"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.727395 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-94nv4"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.727551 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.727850 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.734868 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.735435 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.785454 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.785791 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.786113 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.786403 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.786532 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.766094 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-754vb"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.786852 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.787196 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.787453 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wgct9"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.788158 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.788510 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.788905 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.789249 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-754vb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.789515 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.789796 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.789809 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.789969 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.792063 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.792213 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.793115 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-99plh"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.794103 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.794460 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.794814 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.795030 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.797114 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.797628 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.797918 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.798000 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.798254 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.798395 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.799817 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.799968 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.799812 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 09 
19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.800246 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.806997 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.807297 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.807550 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.807718 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.807827 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808164 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6rv64"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808196 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808288 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808576 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-config\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808679 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808745 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808827 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62l9\" (UniqueName: \"kubernetes.io/projected/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-kube-api-access-w62l9\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808583 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808616 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809152 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.808909 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/471c7823-e3ea-4b73-9034-3ba5fc123190-serving-cert\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809328 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809354 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hfpt\" (UniqueName: \"kubernetes.io/projected/df82c6a7-c94c-4ed8-9034-45c7515ce78a-kube-api-access-8hfpt\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809442 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swh9v\" (UniqueName: \"kubernetes.io/projected/957d72db-4cb4-4e97-bb11-2f25eb03f259-kube-api-access-swh9v\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " 
pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809483 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-certificates\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809506 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809524 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-oauth-config\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-tls\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809571 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809590 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-stats-auth\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809673 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809726 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-policies\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809752 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810330 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh"] Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.810557 4907 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.310543742 +0000 UTC m=+139.842511231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.809762 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-client-ca\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810783 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810819 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: 
\"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810845 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810871 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-dir\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810886 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810893 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810919 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-config\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810942 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-ca\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810968 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-service-ca\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810988 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df82c6a7-c94c-4ed8-9034-45c7515ce78a-serving-cert\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811008 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w5k8\" (UniqueName: 
\"kubernetes.io/projected/4ef0fdd5-8e73-4320-8021-e6f28b26f248-kube-api-access-9w5k8\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811028 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811045 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-config\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811066 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ef0fdd5-8e73-4320-8021-e6f28b26f248-service-ca-bundle\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811083 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811107 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-serving-cert\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811132 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811147 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df82c6a7-c94c-4ed8-9034-45c7515ce78a-trusted-ca\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-oauth-serving-cert\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811196 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-bound-sa-token\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811230 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811246 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52b58\" (UniqueName: \"kubernetes.io/projected/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-kube-api-access-52b58\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a7cd2efb-c621-4ea7-b1d2-ea923968f737-machine-approver-tls\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e3ba1-e673-4377-a492-ee70dfac0406-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: 
\"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811307 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c696fd0-572e-4fcf-bd2b-66cda008888b-serving-cert\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44e3ba1-e673-4377-a492-ee70dfac0406-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811415 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7cd2efb-c621-4ea7-b1d2-ea923968f737-auth-proxy-config\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810738 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811448 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7cd2efb-c621-4ea7-b1d2-ea923968f737-config\") pod \"machine-approver-56656f9798-rqls2\" 
(UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811458 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811498 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3641da-8c41-4082-8d46-a2590ad60dbd-metrics-tls\") pod \"dns-operator-744455d44c-nxgdx\" (UID: \"fd3641da-8c41-4082-8d46-a2590ad60dbd\") " pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811519 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811547 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df82c6a7-c94c-4ed8-9034-45c7515ce78a-config\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811565 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") 
" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811613 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-client\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811674 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8fjd\" (UniqueName: \"kubernetes.io/projected/a7cd2efb-c621-4ea7-b1d2-ea923968f737-kube-api-access-n8fjd\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811695 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwxk\" (UniqueName: \"kubernetes.io/projected/fd3641da-8c41-4082-8d46-a2590ad60dbd-kube-api-access-kwwxk\") pod \"dns-operator-744455d44c-nxgdx\" (UID: \"fd3641da-8c41-4082-8d46-a2590ad60dbd\") " pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811714 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811741 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qgx78\" (UniqueName: \"kubernetes.io/projected/a44e3ba1-e673-4377-a492-ee70dfac0406-kube-api-access-qgx78\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811759 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811882 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq6x9\" (UniqueName: \"kubernetes.io/projected/5c696fd0-572e-4fcf-bd2b-66cda008888b-kube-api-access-gq6x9\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811902 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-trusted-ca\") pod 
\"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811920 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811927 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.811979 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-trusted-ca-bundle\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8mjk\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-kube-api-access-w8mjk\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812064 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-service-ca\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " 
pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812082 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kf56\" (UniqueName: \"kubernetes.io/projected/471c7823-e3ea-4b73-9034-3ba5fc123190-kube-api-access-6kf56\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.810846 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812415 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2fnwq"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812434 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nnhgl"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812510 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812760 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812829 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.812947 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.818481 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.818779 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.819045 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.826772 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.828425 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.830336 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.830664 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.830790 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.831069 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.831498 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.833107 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.834407 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.835132 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.835803 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.836935 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.837248 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ntw5v"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.841160 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.845457 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nl9hz"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.847869 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.849002 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.849824 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.854050 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.854081 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.854949 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.856952 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.858627 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wz5xt"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.860071 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m47fb"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.861842 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.863037 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nxgdx"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.865166 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mdh4s"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.867043 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.868630 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-99plh"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.870052 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2hbf2"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.871226 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2hbf2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.871616 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qs6lp"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.873522 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.873825 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.875814 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.879541 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hztkx"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.886448 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.888688 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.893599 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.895356 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wgct9"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.896595 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx"] Oct 09 19:30:53 crc 
kubenswrapper[4907]: I1009 19:30:53.898729 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nnhgl"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.901009 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.903108 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-754vb"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.904804 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.907442 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6rv64"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.908854 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.912381 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.912573 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc 
kubenswrapper[4907]: I1009 19:30:53.912602 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w62l9\" (UniqueName: \"kubernetes.io/projected/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-kube-api-access-w62l9\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.912624 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/471c7823-e3ea-4b73-9034-3ba5fc123190-serving-cert\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.912723 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n"] Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.912852 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.412809792 +0000 UTC m=+139.944777281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.912985 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04db531f-adc2-4158-8d64-109774b8115e-serving-cert\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.913047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8dk\" (UniqueName: \"kubernetes.io/projected/9f6cd201-6a64-4058-9e26-946d60f89c38-kube-api-access-tq8dk\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.913735 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.913848 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914208 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxz4c\" (UniqueName: 
\"kubernetes.io/projected/86eff4e6-938a-48fa-a116-c46597bc0868-kube-api-access-fxz4c\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914264 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38670a3a-3b58-403a-92e0-f14d5dda51f3-trusted-ca\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914312 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914355 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfpt\" (UniqueName: \"kubernetes.io/projected/df82c6a7-c94c-4ed8-9034-45c7515ce78a-kube-api-access-8hfpt\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914396 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swh9v\" (UniqueName: \"kubernetes.io/projected/957d72db-4cb4-4e97-bb11-2f25eb03f259-kube-api-access-swh9v\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: 
I1009 19:30:53.914438 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-certificates\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914514 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4150c40f-0b19-4f81-b11c-6b19b25922b1-images\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914563 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32375b53-7eef-44a8-96f8-e422ff17dd63-auth-proxy-config\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914604 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914650 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-oauth-config\") pod 
\"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914726 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914772 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-tls\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914817 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914862 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86eff4e6-938a-48fa-a116-c46597bc0868-secret-volume\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914906 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/38670a3a-3b58-403a-92e0-f14d5dda51f3-metrics-tls\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.914960 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915003 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-stats-auth\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915041 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-policies\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915076 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-client-ca\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915112 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04db531f-adc2-4158-8d64-109774b8115e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915125 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915149 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p86d9\" (UniqueName: \"kubernetes.io/projected/04db531f-adc2-4158-8d64-109774b8115e-kube-api-access-p86d9\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915187 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/23113625-7519-44a3-b330-50463a4800c4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915238 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76nh4\" (UniqueName: \"kubernetes.io/projected/38670a3a-3b58-403a-92e0-f14d5dda51f3-kube-api-access-76nh4\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915277 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-config\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/32375b53-7eef-44a8-96f8-e422ff17dd63-images\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915363 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915402 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915441 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02b8e549-abd9-4adb-a77a-f2af6305625a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q95gm\" (UID: \"02b8e549-abd9-4adb-a77a-f2af6305625a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915516 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915559 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-dir\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915603 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915646 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-config\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc 
kubenswrapper[4907]: I1009 19:30:53.915681 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-ca\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915721 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h64xv\" (UniqueName: \"kubernetes.io/projected/099dbc78-3133-444f-b40a-b931b090a2d9-kube-api-access-h64xv\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915761 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-service-ca\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915799 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23113625-7519-44a3-b330-50463a4800c4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915833 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-srv-cert\") pod 
\"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915925 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df82c6a7-c94c-4ed8-9034-45c7515ce78a-serving-cert\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915962 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4150c40f-0b19-4f81-b11c-6b19b25922b1-config\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.915995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-profile-collector-cert\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916040 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w5k8\" (UniqueName: \"kubernetes.io/projected/4ef0fdd5-8e73-4320-8021-e6f28b26f248-kube-api-access-9w5k8\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916075 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-config\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r75ls\" (UniqueName: \"kubernetes.io/projected/fc36d689-70da-40c4-93ae-f5e35e414999-kube-api-access-r75ls\") pod \"downloads-7954f5f757-754vb\" (UID: \"fc36d689-70da-40c4-93ae-f5e35e414999\") " pod="openshift-console/downloads-7954f5f757-754vb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916188 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916235 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-serving-cert\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc 
kubenswrapper[4907]: I1009 19:30:53.916274 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz79z\" (UniqueName: \"kubernetes.io/projected/02b8e549-abd9-4adb-a77a-f2af6305625a-kube-api-access-hz79z\") pod \"control-plane-machine-set-operator-78cbb6b69f-q95gm\" (UID: \"02b8e549-abd9-4adb-a77a-f2af6305625a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916315 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ef0fdd5-8e73-4320-8021-e6f28b26f248-service-ca-bundle\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916355 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zcmb\" (UniqueName: \"kubernetes.io/projected/32375b53-7eef-44a8-96f8-e422ff17dd63-kube-api-access-5zcmb\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916400 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916443 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/df82c6a7-c94c-4ed8-9034-45c7515ce78a-trusted-ca\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916504 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-oauth-serving-cert\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916543 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-bound-sa-token\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916640 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/099dbc78-3133-444f-b40a-b931b090a2d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916666 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lqqp7"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916690 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52b58\" (UniqueName: \"kubernetes.io/projected/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-kube-api-access-52b58\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916773 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcmc\" (UniqueName: \"kubernetes.io/projected/568e82c4-d3cb-4501-a7f6-2343d01f0d60-kube-api-access-gxcmc\") pod \"package-server-manager-789f6589d5-7dphk\" (UID: \"568e82c4-d3cb-4501-a7f6-2343d01f0d60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916808 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916841 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: 
\"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916893 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a7cd2efb-c621-4ea7-b1d2-ea923968f737-machine-approver-tls\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916935 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdx2b\" (UniqueName: \"kubernetes.io/projected/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-kube-api-access-vdx2b\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.916971 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/099dbc78-3133-444f-b40a-b931b090a2d9-proxy-tls\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917002 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wdct\" (UniqueName: \"kubernetes.io/projected/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-kube-api-access-4wdct\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917038 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e3ba1-e673-4377-a492-ee70dfac0406-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917075 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c696fd0-572e-4fcf-bd2b-66cda008888b-serving-cert\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917111 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23113625-7519-44a3-b330-50463a4800c4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38670a3a-3b58-403a-92e0-f14d5dda51f3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917186 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44e3ba1-e673-4377-a492-ee70dfac0406-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917225 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f6cd201-6a64-4058-9e26-946d60f89c38-webhook-cert\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917256 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917291 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dnr\" (UniqueName: \"kubernetes.io/projected/23113625-7519-44a3-b330-50463a4800c4-kube-api-access-h6dnr\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917328 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75k2k\" (UniqueName: \"kubernetes.io/projected/4150c40f-0b19-4f81-b11c-6b19b25922b1-kube-api-access-75k2k\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4150c40f-0b19-4f81-b11c-6b19b25922b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917407 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7cd2efb-c621-4ea7-b1d2-ea923968f737-auth-proxy-config\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917454 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7cd2efb-c621-4ea7-b1d2-ea923968f737-config\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3641da-8c41-4082-8d46-a2590ad60dbd-metrics-tls\") pod \"dns-operator-744455d44c-nxgdx\" (UID: \"fd3641da-8c41-4082-8d46-a2590ad60dbd\") " pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917548 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917587 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df82c6a7-c94c-4ed8-9034-45c7515ce78a-config\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917639 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9f6cd201-6a64-4058-9e26-946d60f89c38-tmpfs\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917673 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917707 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-client\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917739 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f6cd201-6a64-4058-9e26-946d60f89c38-apiservice-cert\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwxk\" (UniqueName: \"kubernetes.io/projected/fd3641da-8c41-4082-8d46-a2590ad60dbd-kube-api-access-kwwxk\") pod \"dns-operator-744455d44c-nxgdx\" (UID: \"fd3641da-8c41-4082-8d46-a2590ad60dbd\") " pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917808 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917819 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-94nv4"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917842 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgx78\" (UniqueName: \"kubernetes.io/projected/a44e3ba1-e673-4377-a492-ee70dfac0406-kube-api-access-qgx78\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917878 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917909 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917943 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq6x9\" (UniqueName: \"kubernetes.io/projected/5c696fd0-572e-4fcf-bd2b-66cda008888b-kube-api-access-gq6x9\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.917979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8fjd\" (UniqueName: \"kubernetes.io/projected/a7cd2efb-c621-4ea7-b1d2-ea923968f737-kube-api-access-n8fjd\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-trusted-ca\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 
09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918044 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918082 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-trusted-ca-bundle\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918140 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8mjk\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-kube-api-access-w8mjk\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918177 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-cert\") pod \"ingress-canary-2hbf2\" (UID: \"1accac4f-68a4-4bf9-92a6-06d8c1c36db9\") " pod="openshift-ingress-canary/ingress-canary-2hbf2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918213 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqsnb\" (UniqueName: \"kubernetes.io/projected/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-kube-api-access-wqsnb\") pod \"ingress-canary-2hbf2\" (UID: \"1accac4f-68a4-4bf9-92a6-06d8c1c36db9\") " 
pod="openshift-ingress-canary/ingress-canary-2hbf2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918251 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-service-ca\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918282 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kf56\" (UniqueName: \"kubernetes.io/projected/471c7823-e3ea-4b73-9034-3ba5fc123190-kube-api-access-6kf56\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918317 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/568e82c4-d3cb-4501-a7f6-2343d01f0d60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7dphk\" (UID: \"568e82c4-d3cb-4501-a7f6-2343d01f0d60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918387 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/32375b53-7eef-44a8-96f8-e422ff17dd63-proxy-tls\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2xhg\" (UniqueName: \"kubernetes.io/projected/92c30e77-e0cc-4e3a-a38e-0856daffccd2-kube-api-access-k2xhg\") pod \"migrator-59844c95c7-zp9bm\" (UID: \"92c30e77-e0cc-4e3a-a38e-0856daffccd2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.918459 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-config\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.919621 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-tls\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.919679 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.919958 
4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/471c7823-e3ea-4b73-9034-3ba5fc123190-serving-cert\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.920699 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-config\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.920871 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.921014 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-certificates\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: E1009 19:30:53.921597 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.421579586 +0000 UTC m=+139.953547075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.921827 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e3ba1-e673-4377-a492-ee70dfac0406-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.922031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-policies\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.922858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7cd2efb-c621-4ea7-b1d2-ea923968f737-config\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.922942 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7cd2efb-c621-4ea7-b1d2-ea923968f737-auth-proxy-config\") pod \"machine-approver-56656f9798-rqls2\" 
(UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.923262 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-client-ca\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.923666 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a7cd2efb-c621-4ea7-b1d2-ea923968f737-machine-approver-tls\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.923713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.923805 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.923915 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2hbf2"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.924074 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-dir\") pod 
\"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.924353 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df82c6a7-c94c-4ed8-9034-45c7515ce78a-trusted-ca\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.924879 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-oauth-config\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.924966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.925116 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-trusted-ca-bundle\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.925295 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-service-ca\") pod 
\"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.925503 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.925910 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3641da-8c41-4082-8d46-a2590ad60dbd-metrics-tls\") pod \"dns-operator-744455d44c-nxgdx\" (UID: \"fd3641da-8c41-4082-8d46-a2590ad60dbd\") " pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.926200 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-config\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.926301 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df82c6a7-c94c-4ed8-9034-45c7515ce78a-config\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.926351 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.926658 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.926952 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.927261 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.927340 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c696fd0-572e-4fcf-bd2b-66cda008888b-serving-cert\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.927577 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-service-ca\") pod \"console-f9d7485db-2fnwq\" 
(UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.928107 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-trusted-ca\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.928390 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-config\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.929116 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.929287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-ca\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.929529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df82c6a7-c94c-4ed8-9034-45c7515ce78a-serving-cert\") pod \"console-operator-58897d9998-qs6lp\" (UID: 
\"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.929912 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-oauth-serving-cert\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.932119 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.932278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.932997 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.933165 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.934582 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/471c7823-e3ea-4b73-9034-3ba5fc123190-etcd-client\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.934719 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.934854 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-serving-cert\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.935211 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.935303 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.936275 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.936337 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.938023 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ntw5v"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.939812 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nl9hz"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.941080 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zc7nw"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.941853 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.942484 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g48tm"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.943652 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.953288 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.953345 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.953361 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.944938 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44e3ba1-e673-4377-a492-ee70dfac0406-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.958712 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g48tm"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.960950 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.966175 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h9vb8"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.967385 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.968923 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h9vb8"] Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.974099 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 19:30:53 crc kubenswrapper[4907]: I1009 19:30:53.994408 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.013045 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.019147 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.019304 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.519276379 +0000 UTC m=+140.051243868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.019624 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9f6cd201-6a64-4058-9e26-946d60f89c38-tmpfs\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.019695 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f6cd201-6a64-4058-9e26-946d60f89c38-apiservice-cert\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.019813 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-cert\") pod \"ingress-canary-2hbf2\" (UID: \"1accac4f-68a4-4bf9-92a6-06d8c1c36db9\") " pod="openshift-ingress-canary/ingress-canary-2hbf2" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.019842 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqsnb\" (UniqueName: \"kubernetes.io/projected/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-kube-api-access-wqsnb\") pod \"ingress-canary-2hbf2\" (UID: \"1accac4f-68a4-4bf9-92a6-06d8c1c36db9\") " 
pod="openshift-ingress-canary/ingress-canary-2hbf2" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.019888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.019915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/568e82c4-d3cb-4501-a7f6-2343d01f0d60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7dphk\" (UID: \"568e82c4-d3cb-4501-a7f6-2343d01f0d60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020014 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32375b53-7eef-44a8-96f8-e422ff17dd63-proxy-tls\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020341 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xhg\" (UniqueName: \"kubernetes.io/projected/92c30e77-e0cc-4e3a-a38e-0856daffccd2-kube-api-access-k2xhg\") pod \"migrator-59844c95c7-zp9bm\" (UID: \"92c30e77-e0cc-4e3a-a38e-0856daffccd2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020487 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/04db531f-adc2-4158-8d64-109774b8115e-serving-cert\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020610 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq8dk\" (UniqueName: \"kubernetes.io/projected/9f6cd201-6a64-4058-9e26-946d60f89c38-kube-api-access-tq8dk\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020688 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxz4c\" (UniqueName: \"kubernetes.io/projected/86eff4e6-938a-48fa-a116-c46597bc0868-kube-api-access-fxz4c\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38670a3a-3b58-403a-92e0-f14d5dda51f3-trusted-ca\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020792 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:54 crc 
kubenswrapper[4907]: I1009 19:30:54.020850 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4150c40f-0b19-4f81-b11c-6b19b25922b1-images\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020873 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32375b53-7eef-44a8-96f8-e422ff17dd63-auth-proxy-config\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020918 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86eff4e6-938a-48fa-a116-c46597bc0868-secret-volume\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.020944 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38670a3a-3b58-403a-92e0-f14d5dda51f3-metrics-tls\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021037 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: 
\"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021087 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04db531f-adc2-4158-8d64-109774b8115e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p86d9\" (UniqueName: \"kubernetes.io/projected/04db531f-adc2-4158-8d64-109774b8115e-kube-api-access-p86d9\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/23113625-7519-44a3-b330-50463a4800c4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021175 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76nh4\" (UniqueName: \"kubernetes.io/projected/38670a3a-3b58-403a-92e0-f14d5dda51f3-kube-api-access-76nh4\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021196 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-config\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/32375b53-7eef-44a8-96f8-e422ff17dd63-images\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021265 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02b8e549-abd9-4adb-a77a-f2af6305625a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q95gm\" (UID: \"02b8e549-abd9-4adb-a77a-f2af6305625a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021458 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h64xv\" (UniqueName: \"kubernetes.io/projected/099dbc78-3133-444f-b40a-b931b090a2d9-kube-api-access-h64xv\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23113625-7519-44a3-b330-50463a4800c4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" 
(UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021555 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-srv-cert\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021583 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4150c40f-0b19-4f81-b11c-6b19b25922b1-config\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021614 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04db531f-adc2-4158-8d64-109774b8115e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021622 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-profile-collector-cert\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.021729 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.521690061 +0000 UTC m=+140.053657590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021812 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r75ls\" (UniqueName: \"kubernetes.io/projected/fc36d689-70da-40c4-93ae-f5e35e414999-kube-api-access-r75ls\") pod \"downloads-7954f5f757-754vb\" (UID: \"fc36d689-70da-40c4-93ae-f5e35e414999\") " pod="openshift-console/downloads-7954f5f757-754vb" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021906 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz79z\" (UniqueName: \"kubernetes.io/projected/02b8e549-abd9-4adb-a77a-f2af6305625a-kube-api-access-hz79z\") pod \"control-plane-machine-set-operator-78cbb6b69f-q95gm\" (UID: \"02b8e549-abd9-4adb-a77a-f2af6305625a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021935 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4150c40f-0b19-4f81-b11c-6b19b25922b1-images\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:54 crc 
kubenswrapper[4907]: I1009 19:30:54.021979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zcmb\" (UniqueName: \"kubernetes.io/projected/32375b53-7eef-44a8-96f8-e422ff17dd63-kube-api-access-5zcmb\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.022119 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/099dbc78-3133-444f-b40a-b931b090a2d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.022178 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcmc\" (UniqueName: \"kubernetes.io/projected/568e82c4-d3cb-4501-a7f6-2343d01f0d60-kube-api-access-gxcmc\") pod \"package-server-manager-789f6589d5-7dphk\" (UID: \"568e82c4-d3cb-4501-a7f6-2343d01f0d60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.022231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.022281 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.022362 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdx2b\" (UniqueName: \"kubernetes.io/projected/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-kube-api-access-vdx2b\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.022414 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/099dbc78-3133-444f-b40a-b931b090a2d9-proxy-tls\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.021905 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/32375b53-7eef-44a8-96f8-e422ff17dd63-auth-proxy-config\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.023018 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4150c40f-0b19-4f81-b11c-6b19b25922b1-config\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 
19:30:54.023195 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9f6cd201-6a64-4058-9e26-946d60f89c38-tmpfs\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.024261 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23113625-7519-44a3-b330-50463a4800c4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.024357 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04db531f-adc2-4158-8d64-109774b8115e-serving-cert\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.024553 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wdct\" (UniqueName: \"kubernetes.io/projected/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-kube-api-access-4wdct\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.024751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23113625-7519-44a3-b330-50463a4800c4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.024898 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38670a3a-3b58-403a-92e0-f14d5dda51f3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.025024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f6cd201-6a64-4058-9e26-946d60f89c38-webhook-cert\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.025153 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.025059 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/23113625-7519-44a3-b330-50463a4800c4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.025273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dnr\" (UniqueName: 
\"kubernetes.io/projected/23113625-7519-44a3-b330-50463a4800c4-kube-api-access-h6dnr\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.025344 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75k2k\" (UniqueName: \"kubernetes.io/projected/4150c40f-0b19-4f81-b11c-6b19b25922b1-kube-api-access-75k2k\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.025390 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4150c40f-0b19-4f81-b11c-6b19b25922b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.026034 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/099dbc78-3133-444f-b40a-b931b090a2d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.029666 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4150c40f-0b19-4f81-b11c-6b19b25922b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.033324 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.053636 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.073007 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.082969 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/32375b53-7eef-44a8-96f8-e422ff17dd63-images\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.093281 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.114309 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.124523 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/32375b53-7eef-44a8-96f8-e422ff17dd63-proxy-tls\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.125874 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.126197 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.626171057 +0000 UTC m=+140.158138546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.126636 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.127064 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.62705454 +0000 UTC m=+140.159022029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.133423 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.153099 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.173524 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.185845 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-srv-cert\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.193555 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.205431 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86eff4e6-938a-48fa-a116-c46597bc0868-secret-volume\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.206027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-profile-collector-cert\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.213785 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.228421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.228650 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.728615292 +0000 UTC m=+140.260582781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.229337 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.229814 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.729797822 +0000 UTC m=+140.261765371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.253795 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.273866 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.293426 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.298690 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.313339 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.330535 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.330806 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.830761867 +0000 UTC m=+140.362729356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.331210 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.332070 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.83203834 +0000 UTC m=+140.364005839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.343318 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.353313 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.354430 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.366532 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/099dbc78-3133-444f-b40a-b931b090a2d9-proxy-tls\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.373987 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.392701 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.405033 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/568e82c4-d3cb-4501-a7f6-2343d01f0d60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7dphk\" (UID: \"568e82c4-d3cb-4501-a7f6-2343d01f0d60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.415280 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.432453 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.432873 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.932833892 +0000 UTC m=+140.464801431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.433400 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.433635 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.434105 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:54.934085484 +0000 UTC m=+140.466053013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.447610 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02b8e549-abd9-4adb-a77a-f2af6305625a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q95gm\" (UID: \"02b8e549-abd9-4adb-a77a-f2af6305625a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.454192 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.459589 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9f6cd201-6a64-4058-9e26-946d60f89c38-webhook-cert\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.465925 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9f6cd201-6a64-4058-9e26-946d60f89c38-apiservice-cert\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 
19:30:54.474423 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.494375 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.514765 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.534070 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.534530 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.534679 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.03465935 +0000 UTC m=+140.566626839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.535061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.535488 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.035457151 +0000 UTC m=+140.567424640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.553564 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.574888 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.588192 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38670a3a-3b58-403a-92e0-f14d5dda51f3-metrics-tls\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.594300 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.624856 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.633338 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38670a3a-3b58-403a-92e0-f14d5dda51f3-trusted-ca\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:54 crc 
kubenswrapper[4907]: I1009 19:30:54.633669 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.636931 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.637152 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.137118695 +0000 UTC m=+140.669086184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.637497 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.637898 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.137890725 +0000 UTC m=+140.669858214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.654052 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.674708 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.695413 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.714241 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.736861 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.738378 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.738982 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.238963914 +0000 UTC m=+140.770931403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.754200 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.774779 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.794752 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.813796 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.831865 4907 request.go:700] Waited for 1.000414186s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.833627 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.839635 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.840257 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.340240198 +0000 UTC m=+140.872207697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.854013 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.875408 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.886747 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.894204 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.902920 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-config\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.913525 4907 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.921456 4907 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.921673 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-stats-auth podName:4ef0fdd5-8e73-4320-8021-e6f28b26f248 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.421646463 +0000 UTC m=+140.953613962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-stats-auth") pod "router-default-5444994796-mzwdh" (UID: "4ef0fdd5-8e73-4320-8021-e6f28b26f248") : failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.923613 4907 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.923662 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate podName:4ef0fdd5-8e73-4320-8021-e6f28b26f248 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.423650625 +0000 UTC m=+140.955618114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate") pod "router-default-5444994796-mzwdh" (UID: "4ef0fdd5-8e73-4320-8021-e6f28b26f248") : failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.924837 4907 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.924987 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs podName:4ef0fdd5-8e73-4320-8021-e6f28b26f248 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.424933488 +0000 UTC m=+140.956900997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs") pod "router-default-5444994796-mzwdh" (UID: "4ef0fdd5-8e73-4320-8021-e6f28b26f248") : failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.928494 4907 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.928581 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ef0fdd5-8e73-4320-8021-e6f28b26f248-service-ca-bundle podName:4ef0fdd5-8e73-4320-8021-e6f28b26f248 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.428560241 +0000 UTC m=+140.960527930 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4ef0fdd5-8e73-4320-8021-e6f28b26f248-service-ca-bundle") pod "router-default-5444994796-mzwdh" (UID: "4ef0fdd5-8e73-4320-8021-e6f28b26f248") : failed to sync configmap cache: timed out waiting for the condition Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.933012 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.941056 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.941418 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.441396649 +0000 UTC m=+140.973364148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.941682 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:54 crc kubenswrapper[4907]: E1009 19:30:54.943237 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.443214456 +0000 UTC m=+140.975181945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.953563 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.975252 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 09 19:30:54 crc kubenswrapper[4907]: I1009 19:30:54.992824 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.014284 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.021050 4907 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.021154 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-cert podName:1accac4f-68a4-4bf9-92a6-06d8c1c36db9 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.521125532 +0000 UTC m=+141.053093041 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-cert") pod "ingress-canary-2hbf2" (UID: "1accac4f-68a4-4bf9-92a6-06d8c1c36db9") : failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.021380 4907 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.021666 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume podName:86eff4e6-938a-48fa-a116-c46597bc0868 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.521639695 +0000 UTC m=+141.053607184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume") pod "collect-profiles-29333970-4kdct" (UID: "86eff4e6-938a-48fa-a116-c46597bc0868") : failed to sync configmap cache: timed out waiting for the condition Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.033768 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.042882 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.043003 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.542977182 +0000 UTC m=+141.074944681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.043189 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.043595 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.543584547 +0000 UTC m=+141.075552036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.053917 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.074154 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.092919 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.113812 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.140778 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.144288 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.144453 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.64442349 +0000 UTC m=+141.176390989 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.144666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.145371 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.645344594 +0000 UTC m=+141.177312273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.154353 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.174033 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.194701 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.213938 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.234593 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.245973 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.246093 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.746072804 +0000 UTC m=+141.278040303 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.246458 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.246820 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.746809443 +0000 UTC m=+141.278776932 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.253069 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.273410 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.293530 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.314009 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.333825 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.347654 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.347965 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.847929694 +0000 UTC m=+141.379897183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.348612 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.349263 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.849236967 +0000 UTC m=+141.381204496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.354595 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.374618 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.393351 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.414032 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.434504 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.450332 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.450592 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.950559153 +0000 UTC m=+141.482526642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.450708 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.451232 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-stats-auth\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.451375 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.451551 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.451632 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ef0fdd5-8e73-4320-8021-e6f28b26f248-service-ca-bundle\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh"
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.451819 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:55.951797564 +0000 UTC m=+141.483765153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.453937 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.475603 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.494588 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.514852 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.534064 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.553230 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.553432 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.053399287 +0000 UTC m=+141.585366776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.553522 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-cert\") pod \"ingress-canary-2hbf2\" (UID: \"1accac4f-68a4-4bf9-92a6-06d8c1c36db9\") " pod="openshift-ingress-canary/ingress-canary-2hbf2"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.553654 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.553780 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.554344 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.054322451 +0000 UTC m=+141.586289940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.554726 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.555035 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.559374 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-cert\") pod \"ingress-canary-2hbf2\" (UID: \"1accac4f-68a4-4bf9-92a6-06d8c1c36db9\") " pod="openshift-ingress-canary/ingress-canary-2hbf2"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.608479 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w62l9\" (UniqueName: \"kubernetes.io/projected/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-kube-api-access-w62l9\") pod \"oauth-openshift-558db77b4-hztkx\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " pod="openshift-authentication/oauth-openshift-558db77b4-hztkx"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.643251 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfpt\" (UniqueName: \"kubernetes.io/projected/df82c6a7-c94c-4ed8-9034-45c7515ce78a-kube-api-access-8hfpt\") pod \"console-operator-58897d9998-qs6lp\" (UID: \"df82c6a7-c94c-4ed8-9034-45c7515ce78a\") " pod="openshift-console-operator/console-operator-58897d9998-qs6lp"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.648944 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swh9v\" (UniqueName: \"kubernetes.io/projected/957d72db-4cb4-4e97-bb11-2f25eb03f259-kube-api-access-swh9v\") pod \"console-f9d7485db-2fnwq\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " pod="openshift-console/console-f9d7485db-2fnwq"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.652169 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qs6lp"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.655113 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.655554 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.155507813 +0000 UTC m=+141.687475422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.656148 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.658352 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.158312954 +0000 UTC m=+141.690280443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.665386 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.673730 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwxk\" (UniqueName: \"kubernetes.io/projected/fd3641da-8c41-4082-8d46-a2590ad60dbd-kube-api-access-kwwxk\") pod \"dns-operator-744455d44c-nxgdx\" (UID: \"fd3641da-8c41-4082-8d46-a2590ad60dbd\") " pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.690404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgx78\" (UniqueName: \"kubernetes.io/projected/a44e3ba1-e673-4377-a492-ee70dfac0406-kube-api-access-qgx78\") pod \"openshift-controller-manager-operator-756b6f6bc6-bsxgx\" (UID: \"a44e3ba1-e673-4377-a492-ee70dfac0406\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.727629 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-bound-sa-token\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.757413 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.757619 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.257580927 +0000 UTC m=+141.789548416 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.757823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.758603 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8fjd\" (UniqueName: \"kubernetes.io/projected/a7cd2efb-c621-4ea7-b1d2-ea923968f737-kube-api-access-n8fjd\") pod \"machine-approver-56656f9798-rqls2\" (UID: \"a7cd2efb-c621-4ea7-b1d2-ea923968f737\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2"
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.758909 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.258883901 +0000 UTC m=+141.790851430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.771892 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.781067 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8mjk\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-kube-api-access-w8mjk\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.791888 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52b58\" (UniqueName: \"kubernetes.io/projected/532e3657-8f58-4b6b-8b4d-6d0d6a49d451-kube-api-access-52b58\") pod \"openshift-apiserver-operator-796bbdcf4f-hhfns\" (UID: \"532e3657-8f58-4b6b-8b4d-6d0d6a49d451\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.810204 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2fnwq"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.811283 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq6x9\" (UniqueName: \"kubernetes.io/projected/5c696fd0-572e-4fcf-bd2b-66cda008888b-kube-api-access-gq6x9\") pod \"controller-manager-879f6c89f-m47fb\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.815914 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.831923 4907 request.go:700] Waited for 1.889773488s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.833602 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kf56\" (UniqueName: \"kubernetes.io/projected/471c7823-e3ea-4b73-9034-3ba5fc123190-kube-api-access-6kf56\") pod \"etcd-operator-b45778765-wz5xt\" (UID: \"471c7823-e3ea-4b73-9034-3ba5fc123190\") " pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.834441 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 09 19:30:55 crc kubenswrapper[4907]: W1009 19:30:55.849856 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7cd2efb_c621_4ea7_b1d2_ea923968f737.slice/crio-5dc12673dd33fbb90d913ec99bf2c188c0129a0c36e55698de25f33b390f278e WatchSource:0}: Error finding container 5dc12673dd33fbb90d913ec99bf2c188c0129a0c36e55698de25f33b390f278e: Status 404 returned error can't find the container with id 5dc12673dd33fbb90d913ec99bf2c188c0129a0c36e55698de25f33b390f278e
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.857934 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.860635 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.860961 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.360938755 +0000 UTC m=+141.892906244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.861595 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.862074 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.362066244 +0000 UTC m=+141.894033733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.873485 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.887107 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.897904 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.905460 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qs6lp"]
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.912879 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.913584 4907 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 09 19:30:55 crc kubenswrapper[4907]: W1009 19:30:55.930269 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf82c6a7_c94c_4ed8_9034_45c7515ce78a.slice/crio-243b6000aa79172cdaa8fdccc86737b381ed5e10658b92ff857273e933dc1dbf WatchSource:0}: Error finding container 243b6000aa79172cdaa8fdccc86737b381ed5e10658b92ff857273e933dc1dbf: Status 404 returned error can't find the container with id 243b6000aa79172cdaa8fdccc86737b381ed5e10658b92ff857273e933dc1dbf
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.931297 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hztkx"]
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.932698 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.933354 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.947917 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.953203 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" event={"ID":"a7cd2efb-c621-4ea7-b1d2-ea923968f737","Type":"ContainerStarted","Data":"5dc12673dd33fbb90d913ec99bf2c188c0129a0c36e55698de25f33b390f278e"}
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.953401 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.954503 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qs6lp" event={"ID":"df82c6a7-c94c-4ed8-9034-45c7515ce78a","Type":"ContainerStarted","Data":"243b6000aa79172cdaa8fdccc86737b381ed5e10658b92ff857273e933dc1dbf"}
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.966986 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.967764 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.4677015 +0000 UTC m=+141.999668989 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.970142 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:55 crc kubenswrapper[4907]: E1009 19:30:55.970731 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.470723527 +0000 UTC m=+142.002691016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.977008 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 09 19:30:55 crc kubenswrapper[4907]: I1009 19:30:55.999020 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.024910 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nxgdx"]
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.034300 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqsnb\" (UniqueName: \"kubernetes.io/projected/1accac4f-68a4-4bf9-92a6-06d8c1c36db9-kube-api-access-wqsnb\") pod \"ingress-canary-2hbf2\" (UID: \"1accac4f-68a4-4bf9-92a6-06d8c1c36db9\") " pod="openshift-ingress-canary/ingress-canary-2hbf2"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.058419 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2xhg\" (UniqueName: \"kubernetes.io/projected/92c30e77-e0cc-4e3a-a38e-0856daffccd2-kube-api-access-k2xhg\") pod \"migrator-59844c95c7-zp9bm\" (UID: \"92c30e77-e0cc-4e3a-a38e-0856daffccd2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.073545 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.074500 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.574457245 +0000 UTC m=+142.106424734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.085127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq8dk\" (UniqueName: \"kubernetes.io/projected/9f6cd201-6a64-4058-9e26-946d60f89c38-kube-api-access-tq8dk\") pod \"packageserver-d55dfcdfc-6vslh\" (UID: \"9f6cd201-6a64-4058-9e26-946d60f89c38\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.101033 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxz4c\" (UniqueName: \"kubernetes.io/projected/86eff4e6-938a-48fa-a116-c46597bc0868-kube-api-access-fxz4c\") pod \"collect-profiles-29333970-4kdct\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.125858 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.126865 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p86d9\" (UniqueName: \"kubernetes.io/projected/04db531f-adc2-4158-8d64-109774b8115e-kube-api-access-p86d9\") pod \"openshift-config-operator-7777fb866f-94nv4\" (UID: \"04db531f-adc2-4158-8d64-109774b8115e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.133007 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2fnwq"]
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.135751 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h64xv\" (UniqueName: \"kubernetes.io/projected/099dbc78-3133-444f-b40a-b931b090a2d9-kube-api-access-h64xv\") pod \"machine-config-controller-84d6567774-s9h47\" (UID: \"099dbc78-3133-444f-b40a-b931b090a2d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.152974 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76nh4\" (UniqueName: \"kubernetes.io/projected/38670a3a-3b58-403a-92e0-f14d5dda51f3-kube-api-access-76nh4\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n"
Oct 09 19:30:56 crc kubenswrapper[4907]: W1009 19:30:56.166901 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd3641da_8c41_4082_8d46_a2590ad60dbd.slice/crio-687286f7166b56a9ec80ac95dd300612b71d18b5de4f417345036b91f1fa9d9c WatchSource:0}: Error finding container 687286f7166b56a9ec80ac95dd300612b71d18b5de4f417345036b91f1fa9d9c: Status 404 returned error can't find the container with id 687286f7166b56a9ec80ac95dd300612b71d18b5de4f417345036b91f1fa9d9c
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.178040 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m47fb"]
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.178187 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.178591 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.178722 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r75ls\" (UniqueName: \"kubernetes.io/projected/fc36d689-70da-40c4-93ae-f5e35e414999-kube-api-access-r75ls\") pod \"downloads-7954f5f757-754vb\" (UID: \"fc36d689-70da-40c4-93ae-f5e35e414999\") " pod="openshift-console/downloads-7954f5f757-754vb"
Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.179396 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.679374142 +0000 UTC m=+142.211341631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.190345 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zcmb\" (UniqueName: \"kubernetes.io/projected/32375b53-7eef-44a8-96f8-e422ff17dd63-kube-api-access-5zcmb\") pod \"machine-config-operator-74547568cd-99plh\" (UID: \"32375b53-7eef-44a8-96f8-e422ff17dd63\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh"
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.200393 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx"]
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.210067 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.213039 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz79z\" (UniqueName: \"kubernetes.io/projected/02b8e549-abd9-4adb-a77a-f2af6305625a-kube-api-access-hz79z\") pod \"control-plane-machine-set-operator-78cbb6b69f-q95gm\" (UID: \"02b8e549-abd9-4adb-a77a-f2af6305625a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.231254 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcmc\" (UniqueName: \"kubernetes.io/projected/568e82c4-d3cb-4501-a7f6-2343d01f0d60-kube-api-access-gxcmc\") pod \"package-server-manager-789f6589d5-7dphk\" (UID: \"568e82c4-d3cb-4501-a7f6-2343d01f0d60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.249171 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2hbf2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.249909 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdx2b\" (UniqueName: \"kubernetes.io/projected/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-kube-api-access-vdx2b\") pod \"marketplace-operator-79b997595-6rv64\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.272814 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ecfaee0-5af7-4d8e-8ae1-fb923602c05b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-58hw7\" (UID: \"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.279203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.279491 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.779441876 +0000 UTC m=+142.311409365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.279846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.280266 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.780259197 +0000 UTC m=+142.312226686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.284600 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns"] Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.290495 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wdct\" (UniqueName: \"kubernetes.io/projected/c153b2ce-efcd-4d73-8b4f-8e67322e88c5-kube-api-access-4wdct\") pod \"catalog-operator-68c6474976-4g4dt\" (UID: \"c153b2ce-efcd-4d73-8b4f-8e67322e88c5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.297962 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-wz5xt"] Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.306687 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.314895 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23113625-7519-44a3-b330-50463a4800c4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.324901 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-754vb" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.331929 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38670a3a-3b58-403a-92e0-f14d5dda51f3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6tz8n\" (UID: \"38670a3a-3b58-403a-92e0-f14d5dda51f3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.350339 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75k2k\" (UniqueName: \"kubernetes.io/projected/4150c40f-0b19-4f81-b11c-6b19b25922b1-kube-api-access-75k2k\") pod \"machine-api-operator-5694c8668f-lqqp7\" (UID: \"4150c40f-0b19-4f81-b11c-6b19b25922b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.378184 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.383383 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.383911 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.883889461 +0000 UTC m=+142.415856950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.386498 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.392936 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.396827 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.398990 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh"] Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.399025 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dnr\" (UniqueName: \"kubernetes.io/projected/23113625-7519-44a3-b330-50463a4800c4-kube-api-access-h6dnr\") pod \"cluster-image-registry-operator-dc59b4c8b-5l6b5\" (UID: \"23113625-7519-44a3-b330-50463a4800c4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.401354 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.406401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-stats-auth\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.409057 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.416280 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.416442 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.423343 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ef0fdd5-8e73-4320-8021-e6f28b26f248-service-ca-bundle\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.436761 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.450896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w5k8\" (UniqueName: \"kubernetes.io/projected/4ef0fdd5-8e73-4320-8021-e6f28b26f248-kube-api-access-9w5k8\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.450976 4907 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.451032 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate podName:4ef0fdd5-8e73-4320-8021-e6f28b26f248 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.451017161 +0000 UTC m=+142.982984650 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate") pod "router-default-5444994796-mzwdh" (UID: "4ef0fdd5-8e73-4320-8021-e6f28b26f248") : failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.452883 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.453881 4907 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.454051 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs podName:4ef0fdd5-8e73-4320-8021-e6f28b26f248 nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.454017628 +0000 UTC m=+142.985985107 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs") pod "router-default-5444994796-mzwdh" (UID: "4ef0fdd5-8e73-4320-8021-e6f28b26f248") : failed to sync secret cache: timed out waiting for the condition Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.454880 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.461853 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm"] Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.473458 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.485336 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.488062 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.488409 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:56.988374988 +0000 UTC m=+142.520342497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: W1009 19:30:56.495769 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c30e77_e0cc_4e3a_a38e_0856daffccd2.slice/crio-e5550666300482b3c99f56cdc7de3e98b4ef974b371b1ae845b3d32a8d7554e8 WatchSource:0}: Error finding container e5550666300482b3c99f56cdc7de3e98b4ef974b371b1ae845b3d32a8d7554e8: Status 404 returned error can't find the container with id e5550666300482b3c99f56cdc7de3e98b4ef974b371b1ae845b3d32a8d7554e8 Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.530707 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"] Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.580133 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.587205 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.588022 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.087968979 +0000 UTC m=+142.619936468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588141 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b7a19a90-b711-4db3-9d80-a111398d62d0-signing-cabundle\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588199 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g92zd\" (UniqueName: 
\"kubernetes.io/projected/23e92368-60d2-4a4c-b8ba-0c2464bb99e7-kube-api-access-g92zd\") pod \"multus-admission-controller-857f4d67dd-nnhgl\" (UID: \"23e92368-60d2-4a4c-b8ba-0c2464bb99e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588228 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7dpw\" (UniqueName: \"kubernetes.io/projected/aa4f766f-2388-46bb-8738-e09b42f189c6-kube-api-access-l7dpw\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588293 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91d8cef-721a-42b2-a7e7-17d555f9943a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588324 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b7a19a90-b711-4db3-9d80-a111398d62d0-signing-key\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea93b7-2762-4b9c-ac3b-c7980070faf1-config\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588378 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-node-pullsecrets\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588448 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6107861-4366-4081-8f67-3f23d66590f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7f00b5d-8d8f-4246-a5c2-da2181451a39-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588599 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-serving-cert\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.588710 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-service-ca-bundle\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.589856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea93b7-2762-4b9c-ac3b-c7980070faf1-serving-cert\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.589901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-config\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.589942 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aa4f766f-2388-46bb-8738-e09b42f189c6-srv-cert\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590198 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590310 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23e92368-60d2-4a4c-b8ba-0c2464bb99e7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nnhgl\" (UID: \"23e92368-60d2-4a4c-b8ba-0c2464bb99e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590364 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpcdf\" (UniqueName: \"kubernetes.io/projected/b7a19a90-b711-4db3-9d80-a111398d62d0-kube-api-access-lpcdf\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590774 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-serving-cert\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.590816 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.090795031 +0000 UTC m=+142.622762540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590857 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjg9s\" (UniqueName: \"kubernetes.io/projected/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-kube-api-access-rjg9s\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590891 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-audit-dir\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590910 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-etcd-serving-ca\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.590930 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591007 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-config\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591069 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-etcd-client\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/697f3940-9b2a-4fbc-9e99-933a3645bfc3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fn8wz\" (UID: \"697f3940-9b2a-4fbc-9e99-933a3645bfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-config\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591265 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2sj6r\" (UniqueName: \"kubernetes.io/projected/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-kube-api-access-2sj6r\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591289 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-audit-policies\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591315 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591343 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7f00b5d-8d8f-4246-a5c2-da2181451a39-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591378 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxcr\" (UniqueName: \"kubernetes.io/projected/65313d6f-23ee-4269-ad8c-140fb200c3e5-kube-api-access-jwxcr\") pod 
\"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aa4f766f-2388-46bb-8738-e09b42f189c6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f00b5d-8d8f-4246-a5c2-da2181451a39-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591479 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-image-import-ca\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591538 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6107861-4366-4081-8f67-3f23d66590f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 
19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqhtn\" (UniqueName: \"kubernetes.io/projected/c6107861-4366-4081-8f67-3f23d66590f2-kube-api-access-qqhtn\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591596 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591870 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591931 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91d8cef-721a-42b2-a7e7-17d555f9943a-config\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.591959 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e91d8cef-721a-42b2-a7e7-17d555f9943a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.592051 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-encryption-config\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.592082 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-client-ca\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.592535 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7btvl\" (UniqueName: \"kubernetes.io/projected/697f3940-9b2a-4fbc-9e99-933a3645bfc3-kube-api-access-7btvl\") pod \"cluster-samples-operator-665b6dd947-fn8wz\" (UID: \"697f3940-9b2a-4fbc-9e99-933a3645bfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.592586 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65313d6f-23ee-4269-ad8c-140fb200c3e5-serving-cert\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: 
\"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.592611 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-audit\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.592634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-serving-cert\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.597195 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-encryption-config\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.597256 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4pzg\" (UniqueName: \"kubernetes.io/projected/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-kube-api-access-w4pzg\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.597748 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkn6r\" (UniqueName: 
\"kubernetes.io/projected/2dea93b7-2762-4b9c-ac3b-c7980070faf1-kube-api-access-kkn6r\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.597787 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-etcd-client\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.598050 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-audit-dir\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.602120 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.612868 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2hbf2"] Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.703563 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.703988 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxcr\" (UniqueName: \"kubernetes.io/projected/65313d6f-23ee-4269-ad8c-140fb200c3e5-kube-api-access-jwxcr\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.704020 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aa4f766f-2388-46bb-8738-e09b42f189c6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.704060 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-image-import-ca\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 
19:30:56.704080 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f00b5d-8d8f-4246-a5c2-da2181451a39-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.704108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-mountpoint-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.704128 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqhtn\" (UniqueName: \"kubernetes.io/projected/c6107861-4366-4081-8f67-3f23d66590f2-kube-api-access-qqhtn\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.704151 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6107861-4366-4081-8f67-3f23d66590f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.704975 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705007 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91d8cef-721a-42b2-a7e7-17d555f9943a-config\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705088 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e91d8cef-721a-42b2-a7e7-17d555f9943a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705134 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-encryption-config\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705152 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-client-ca\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705208 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65313d6f-23ee-4269-ad8c-140fb200c3e5-serving-cert\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705235 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7btvl\" (UniqueName: \"kubernetes.io/projected/697f3940-9b2a-4fbc-9e99-933a3645bfc3-kube-api-access-7btvl\") pod \"cluster-samples-operator-665b6dd947-fn8wz\" (UID: \"697f3940-9b2a-4fbc-9e99-933a3645bfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705253 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-encryption-config\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705271 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-audit\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705491 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-serving-cert\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4pzg\" (UniqueName: \"kubernetes.io/projected/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-kube-api-access-w4pzg\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705529 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f864bfbb-0015-4f97-9ebc-3775f5694fdb-metrics-tls\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705553 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkn6r\" (UniqueName: \"kubernetes.io/projected/2dea93b7-2762-4b9c-ac3b-c7980070faf1-kube-api-access-kkn6r\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705599 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-etcd-client\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: 
\"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-audit-dir\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705662 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-csi-data-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6jv\" (UniqueName: \"kubernetes.io/projected/56197cfc-90ab-486e-8812-3ac1d97b9f2a-kube-api-access-8v6jv\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705710 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrdsn\" (UniqueName: \"kubernetes.io/projected/f864bfbb-0015-4f97-9ebc-3775f5694fdb-kube-api-access-lrdsn\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705746 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b7a19a90-b711-4db3-9d80-a111398d62d0-signing-cabundle\") pod 
\"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705764 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-registration-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705788 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g92zd\" (UniqueName: \"kubernetes.io/projected/23e92368-60d2-4a4c-b8ba-0c2464bb99e7-kube-api-access-g92zd\") pod \"multus-admission-controller-857f4d67dd-nnhgl\" (UID: \"23e92368-60d2-4a4c-b8ba-0c2464bb99e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705805 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7dpw\" (UniqueName: \"kubernetes.io/projected/aa4f766f-2388-46bb-8738-e09b42f189c6-kube-api-access-l7dpw\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705821 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b7a19a90-b711-4db3-9d80-a111398d62d0-signing-key\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705852 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e91d8cef-721a-42b2-a7e7-17d555f9943a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.705889 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.205866489 +0000 UTC m=+142.737833978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705937 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-plugins-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.705989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-node-pullsecrets\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706009 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea93b7-2762-4b9c-ac3b-c7980070faf1-config\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg4xv\" (UniqueName: \"kubernetes.io/projected/581fdc11-a986-47b3-9a77-7e9ebe63b95d-kube-api-access-pg4xv\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706084 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6107861-4366-4081-8f67-3f23d66590f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7f00b5d-8d8f-4246-a5c2-da2181451a39-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706168 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-serving-cert\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: 
\"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706189 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/581fdc11-a986-47b3-9a77-7e9ebe63b95d-certs\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706222 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-service-ca-bundle\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-config\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706285 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea93b7-2762-4b9c-ac3b-c7980070faf1-serving-cert\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706305 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/581fdc11-a986-47b3-9a77-7e9ebe63b95d-node-bootstrap-token\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aa4f766f-2388-46bb-8738-e09b42f189c6-srv-cert\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706356 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f864bfbb-0015-4f97-9ebc-3775f5694fdb-config-volume\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706406 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706425 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23e92368-60d2-4a4c-b8ba-0c2464bb99e7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nnhgl\" (UID: \"23e92368-60d2-4a4c-b8ba-0c2464bb99e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706446 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lpcdf\" (UniqueName: \"kubernetes.io/projected/b7a19a90-b711-4db3-9d80-a111398d62d0-kube-api-access-lpcdf\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706497 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-serving-cert\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706514 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjg9s\" (UniqueName: \"kubernetes.io/projected/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-kube-api-access-rjg9s\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706544 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-etcd-serving-ca\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706559 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-socket-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc 
kubenswrapper[4907]: I1009 19:30:56.706598 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-audit-dir\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706617 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-config\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706691 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-etcd-client\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706710 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/697f3940-9b2a-4fbc-9e99-933a3645bfc3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fn8wz\" (UID: \"697f3940-9b2a-4fbc-9e99-933a3645bfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 
19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-config\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706789 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sj6r\" (UniqueName: \"kubernetes.io/projected/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-kube-api-access-2sj6r\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706814 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-audit-policies\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.706876 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7f00b5d-8d8f-4246-a5c2-da2181451a39-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.709439 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6107861-4366-4081-8f67-3f23d66590f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.710481 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-service-ca-bundle\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.710867 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.210841507 +0000 UTC m=+142.742808996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.712033 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e91d8cef-721a-42b2-a7e7-17d555f9943a-config\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.712220 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-image-import-ca\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.713050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-etcd-serving-ca\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.713573 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-audit-dir\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.713923 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dea93b7-2762-4b9c-ac3b-c7980070faf1-config\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.714719 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-audit\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.715868 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-config\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.717665 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-client-ca\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.718110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b7a19a90-b711-4db3-9d80-a111398d62d0-signing-cabundle\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.718460 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-encryption-config\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.719912 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e91d8cef-721a-42b2-a7e7-17d555f9943a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.720159 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f00b5d-8d8f-4246-a5c2-da2181451a39-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.720398 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-serving-cert\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.722130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-config\") pod 
\"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.722255 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-config\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.722564 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-audit-dir\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.722605 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-node-pullsecrets\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.722689 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-audit-policies\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.723440 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-trusted-ca-bundle\") pod 
\"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.723976 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.725611 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-754vb"] Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.725906 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.726795 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-etcd-client\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.731482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.741742 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dea93b7-2762-4b9c-ac3b-c7980070faf1-serving-cert\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.742060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6107861-4366-4081-8f67-3f23d66590f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.747982 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7f00b5d-8d8f-4246-a5c2-da2181451a39-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.748335 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-encryption-config\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.748657 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/697f3940-9b2a-4fbc-9e99-933a3645bfc3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fn8wz\" (UID: \"697f3940-9b2a-4fbc-9e99-933a3645bfc3\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.753435 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-serving-cert\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.755208 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/aa4f766f-2388-46bb-8738-e09b42f189c6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.755228 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b7a19a90-b711-4db3-9d80-a111398d62d0-signing-key\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.755613 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-etcd-client\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.757700 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23e92368-60d2-4a4c-b8ba-0c2464bb99e7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-nnhgl\" (UID: \"23e92368-60d2-4a4c-b8ba-0c2464bb99e7\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.759046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65313d6f-23ee-4269-ad8c-140fb200c3e5-serving-cert\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.759828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/aa4f766f-2388-46bb-8738-e09b42f189c6-srv-cert\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: \"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.761152 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e91d8cef-721a-42b2-a7e7-17d555f9943a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9bmr8\" (UID: \"e91d8cef-721a-42b2-a7e7-17d555f9943a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.769362 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.800809 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxcr\" (UniqueName: \"kubernetes.io/projected/65313d6f-23ee-4269-ad8c-140fb200c3e5-kube-api-access-jwxcr\") pod \"route-controller-manager-6576b87f9c-ngfwf\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.808233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.808583 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.30855374 +0000 UTC m=+142.840521229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.808794 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-mountpoint-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.808867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f864bfbb-0015-4f97-9ebc-3775f5694fdb-metrics-tls\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.808908 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-csi-data-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.808930 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6jv\" (UniqueName: \"kubernetes.io/projected/56197cfc-90ab-486e-8812-3ac1d97b9f2a-kube-api-access-8v6jv\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" 
Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.808957 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrdsn\" (UniqueName: \"kubernetes.io/projected/f864bfbb-0015-4f97-9ebc-3775f5694fdb-kube-api-access-lrdsn\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.808989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-registration-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.809037 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-plugins-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.809069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg4xv\" (UniqueName: \"kubernetes.io/projected/581fdc11-a986-47b3-9a77-7e9ebe63b95d-kube-api-access-pg4xv\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.809139 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/581fdc11-a986-47b3-9a77-7e9ebe63b95d-certs\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 
crc kubenswrapper[4907]: I1009 19:30:56.809179 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/581fdc11-a986-47b3-9a77-7e9ebe63b95d-node-bootstrap-token\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.809204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f864bfbb-0015-4f97-9ebc-3775f5694fdb-config-volume\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.809239 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.809285 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-socket-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.812123 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-plugins-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 
19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.812536 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-socket-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.812603 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f864bfbb-0015-4f97-9ebc-3775f5694fdb-config-volume\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.812816 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-csi-data-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.812858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-mountpoint-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.812948 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.312926462 +0000 UTC m=+142.844894171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.813548 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/56197cfc-90ab-486e-8812-3ac1d97b9f2a-registration-dir\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.819056 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/581fdc11-a986-47b3-9a77-7e9ebe63b95d-certs\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.819171 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4pzg\" (UniqueName: \"kubernetes.io/projected/e4b6fb62-3f71-480f-b283-3da1fe2b63b5-kube-api-access-w4pzg\") pod \"apiserver-76f77b778f-wgct9\" (UID: \"e4b6fb62-3f71-480f-b283-3da1fe2b63b5\") " pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.819671 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/581fdc11-a986-47b3-9a77-7e9ebe63b95d-node-bootstrap-token\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " 
pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.821991 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f864bfbb-0015-4f97-9ebc-3775f5694fdb-metrics-tls\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.836139 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g92zd\" (UniqueName: \"kubernetes.io/projected/23e92368-60d2-4a4c-b8ba-0c2464bb99e7-kube-api-access-g92zd\") pod \"multus-admission-controller-857f4d67dd-nnhgl\" (UID: \"23e92368-60d2-4a4c-b8ba-0c2464bb99e7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.837181 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-serving-cert\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.840293 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7f00b5d-8d8f-4246-a5c2-da2181451a39-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wggqg\" (UID: \"f7f00b5d-8d8f-4246-a5c2-da2181451a39\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.856530 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7dpw\" (UniqueName: \"kubernetes.io/projected/aa4f766f-2388-46bb-8738-e09b42f189c6-kube-api-access-l7dpw\") pod \"olm-operator-6b444d44fb-rrgs2\" (UID: 
\"aa4f766f-2388-46bb-8738-e09b42f189c6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.871080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkn6r\" (UniqueName: \"kubernetes.io/projected/2dea93b7-2762-4b9c-ac3b-c7980070faf1-kube-api-access-kkn6r\") pod \"service-ca-operator-777779d784-w8rxx\" (UID: \"2dea93b7-2762-4b9c-ac3b-c7980070faf1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.892250 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjg9s\" (UniqueName: \"kubernetes.io/projected/2d687ee0-5957-4927-98ae-9a7ecdcd29c7-kube-api-access-rjg9s\") pod \"apiserver-7bbb656c7d-9qvzk\" (UID: \"2d687ee0-5957-4927-98ae-9a7ecdcd29c7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.910836 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:56 crc kubenswrapper[4907]: E1009 19:30:56.911349 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.411328812 +0000 UTC m=+142.943296301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.918250 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqhtn\" (UniqueName: \"kubernetes.io/projected/c6107861-4366-4081-8f67-3f23d66590f2-kube-api-access-qqhtn\") pod \"kube-storage-version-migrator-operator-b67b599dd-bz764\" (UID: \"c6107861-4366-4081-8f67-3f23d66590f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.940974 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sj6r\" (UniqueName: \"kubernetes.io/projected/c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9-kube-api-access-2sj6r\") pod \"authentication-operator-69f744f599-ntw5v\" (UID: \"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.961794 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7btvl\" (UniqueName: \"kubernetes.io/projected/697f3940-9b2a-4fbc-9e99-933a3645bfc3-kube-api-access-7btvl\") pod \"cluster-samples-operator-665b6dd947-fn8wz\" (UID: \"697f3940-9b2a-4fbc-9e99-933a3645bfc3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.964028 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.975198 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.980486 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" event={"ID":"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79","Type":"ContainerStarted","Data":"e0df03f9816375e9304fd9c8c3a77c69bde110a2eaf019bdea3454dd0fdeff7b"} Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.980552 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" event={"ID":"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79","Type":"ContainerStarted","Data":"92d93cba88f43252dd356489c1d3aa956e8d09791d3cb6b5e37062860bd3bd08"} Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.980714 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.993943 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpcdf\" (UniqueName: \"kubernetes.io/projected/b7a19a90-b711-4db3-9d80-a111398d62d0-kube-api-access-lpcdf\") pod \"service-ca-9c57cc56f-nl9hz\" (UID: \"b7a19a90-b711-4db3-9d80-a111398d62d0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:56 crc kubenswrapper[4907]: I1009 19:30:56.999841 4907 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hztkx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:56.999932 4907 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" podUID="efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.010821 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qs6lp" event={"ID":"df82c6a7-c94c-4ed8-9034-45c7515ce78a","Type":"ContainerStarted","Data":"18050f47a848b43ce6e661fdb741fb08da401f807227585f74f3c6070782effa"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.013950 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.014005 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-99plh"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.014423 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.014853 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.514836304 +0000 UTC m=+143.046803793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.021036 4907 patch_prober.go:28] interesting pod/console-operator-58897d9998-qs6lp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.021089 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qs6lp" podUID="df82c6a7-c94c-4ed8-9034-45c7515ce78a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.029538 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" event={"ID":"a7cd2efb-c621-4ea7-b1d2-ea923968f737","Type":"ContainerStarted","Data":"68550b329fde2ed6a9bb2d12661612e0776d940398aa27d2abcb7200bfaec207"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.030620 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg4xv\" (UniqueName: \"kubernetes.io/projected/581fdc11-a986-47b3-9a77-7e9ebe63b95d-kube-api-access-pg4xv\") pod \"machine-config-server-zc7nw\" (UID: \"581fdc11-a986-47b3-9a77-7e9ebe63b95d\") " pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:57 crc 
kubenswrapper[4907]: I1009 19:30:57.037161 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6jv\" (UniqueName: \"kubernetes.io/projected/56197cfc-90ab-486e-8812-3ac1d97b9f2a-kube-api-access-8v6jv\") pod \"csi-hostpathplugin-g48tm\" (UID: \"56197cfc-90ab-486e-8812-3ac1d97b9f2a\") " pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.047700 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" event={"ID":"5c696fd0-572e-4fcf-bd2b-66cda008888b","Type":"ContainerStarted","Data":"fcdf5602d765b0a817118acb3e3f83dc7ee26f8254d5b19b660aa69ecc2937f7"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.047767 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" event={"ID":"5c696fd0-572e-4fcf-bd2b-66cda008888b","Type":"ContainerStarted","Data":"33c379ddd63c72e96dffc1ca12a2606da46400a532ec962616f95dc8e8876386"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.048313 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.048317 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.056400 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6rv64"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.064208 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-754vb" event={"ID":"fc36d689-70da-40c4-93ae-f5e35e414999","Type":"ContainerStarted","Data":"62039c6357628cf1f95a12196f5ba3adfbd5981262900d5cb1ba6fdbaff983a8"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.064691 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.072293 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrdsn\" (UniqueName: \"kubernetes.io/projected/f864bfbb-0015-4f97-9ebc-3775f5694fdb-kube-api-access-lrdsn\") pod \"dns-default-h9vb8\" (UID: \"f864bfbb-0015-4f97-9ebc-3775f5694fdb\") " pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.083904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fnwq" event={"ID":"957d72db-4cb4-4e97-bb11-2f25eb03f259","Type":"ContainerStarted","Data":"137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.083985 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fnwq" 
event={"ID":"957d72db-4cb4-4e97-bb11-2f25eb03f259","Type":"ContainerStarted","Data":"e1e6183e65f573fa0745f6134a0b61cc0f5e982a080b33c037faebcac17adf4f"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.106204 4907 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-m47fb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.106289 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" podUID="5c696fd0-572e-4fcf-bd2b-66cda008888b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.106383 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.106874 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.106962 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.113762 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" event={"ID":"a44e3ba1-e673-4377-a492-ee70dfac0406","Type":"ContainerStarted","Data":"55d7ebac80fca645f6717af1d85da9c85a62803627157156f67701844507bf81"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.113831 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" event={"ID":"a44e3ba1-e673-4377-a492-ee70dfac0406","Type":"ContainerStarted","Data":"3b58e04990bb94762bf476f919c3668e0e52f5ba12c426722c54948df2a9c49c"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.124226 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.124761 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.126968 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.626918245 +0000 UTC m=+143.158885964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.127446 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.127522 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm" event={"ID":"92c30e77-e0cc-4e3a-a38e-0856daffccd2","Type":"ContainerStarted","Data":"e5550666300482b3c99f56cdc7de3e98b4ef974b371b1ae845b3d32a8d7554e8"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.130336 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.133005 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" event={"ID":"86eff4e6-938a-48fa-a116-c46597bc0868","Type":"ContainerStarted","Data":"d8c65f0f063869102a45f0ed618e542491a340d08a3ef6ca3b553809bb0519e4"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.135839 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-94nv4"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.139074 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.139809 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" event={"ID":"9f6cd201-6a64-4058-9e26-946d60f89c38","Type":"ContainerStarted","Data":"aaa177130b5732af7de2227c6b19b39ec204580808f567ad4d7e2538fb932741"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.158649 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zc7nw" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.178542 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" event={"ID":"fd3641da-8c41-4082-8d46-a2590ad60dbd","Type":"ContainerStarted","Data":"f6ea0dd76e95e298f8975d40b14e09a5aeee537216be929b7eeeb8c0a705f402"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.178583 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" event={"ID":"fd3641da-8c41-4082-8d46-a2590ad60dbd","Type":"ContainerStarted","Data":"687286f7166b56a9ec80ac95dd300612b71d18b5de4f417345036b91f1fa9d9c"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.179218 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" event={"ID":"471c7823-e3ea-4b73-9034-3ba5fc123190","Type":"ContainerStarted","Data":"485e4b422eb407511eaa8784611f9f3f8a736f139dfd94c8aab681a3676f98b5"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.180709 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.184669 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2hbf2" event={"ID":"1accac4f-68a4-4bf9-92a6-06d8c1c36db9","Type":"ContainerStarted","Data":"a1bb0beb1730cf8504874bd21d83c50ce4b972806f1b513c8b0002cd514b3f32"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.186449 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.188812 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h9vb8" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.210279 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" event={"ID":"532e3657-8f58-4b6b-8b4d-6d0d6a49d451","Type":"ContainerStarted","Data":"635cfe68a3b291f26ad26050e42f16adcbd75d3f8c7df60137d21ce252eb863e"} Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.222433 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.227017 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.234049 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.734033009 +0000 UTC m=+143.266000498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: W1009 19:30:57.297902 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04db531f_adc2_4158_8d64_109774b8115e.slice/crio-e9d6bc3041691fe21059fa19cbb00a771767da370752afdea7e3d8d240341322 WatchSource:0}: Error finding container e9d6bc3041691fe21059fa19cbb00a771767da370752afdea7e3d8d240341322: Status 404 returned error can't find the container with id e9d6bc3041691fe21059fa19cbb00a771767da370752afdea7e3d8d240341322 Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 
19:30:57.332727 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.334028 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:57.83401077 +0000 UTC m=+143.365978259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.434142 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.434867 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 19:30:57.934840833 +0000 UTC m=+143.466808322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.444516 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.449964 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.466723 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.468530 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lqqp7"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.515189 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.538299 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.538496 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.538666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.538709 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.038668183 +0000 UTC m=+143.570635672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.547289 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-default-certificate\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.552763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ef0fdd5-8e73-4320-8021-e6f28b26f248-metrics-certs\") pod \"router-default-5444994796-mzwdh\" (UID: \"4ef0fdd5-8e73-4320-8021-e6f28b26f248\") " pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:57 crc kubenswrapper[4907]: W1009 19:30:57.612092 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38670a3a_3b58_403a_92e0_f14d5dda51f3.slice/crio-9bdc38ee1078465542790587ad52ca005d34d6f99401790d952f7f3fe6dbcdb9 WatchSource:0}: Error finding container 9bdc38ee1078465542790587ad52ca005d34d6f99401790d952f7f3fe6dbcdb9: Status 404 returned error can't find the container with id 9bdc38ee1078465542790587ad52ca005d34d6f99401790d952f7f3fe6dbcdb9 Oct 09 19:30:57 crc kubenswrapper[4907]: W1009 19:30:57.636614 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4150c40f_0b19_4f81_b11c_6b19b25922b1.slice/crio-e8b1e1514cc6f4012c0c09bb5ab841c4b9c30596d7fec1acda1442156ec2dc56 WatchSource:0}: Error finding container e8b1e1514cc6f4012c0c09bb5ab841c4b9c30596d7fec1acda1442156ec2dc56: Status 404 returned error can't find the container with id e8b1e1514cc6f4012c0c09bb5ab841c4b9c30596d7fec1acda1442156ec2dc56 Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.640777 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.641142 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.141125977 +0000 UTC m=+143.673093466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: W1009 19:30:57.661602 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02b8e549_abd9_4adb_a77a_f2af6305625a.slice/crio-d73e154a10164d4d80cd544ff80ea85ed29d1bd294165feac1b6c02bf6064024 WatchSource:0}: Error finding container d73e154a10164d4d80cd544ff80ea85ed29d1bd294165feac1b6c02bf6064024: Status 404 returned error can't find the container with id d73e154a10164d4d80cd544ff80ea85ed29d1bd294165feac1b6c02bf6064024 Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.686089 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.714353 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.738131 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.738184 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ntw5v"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.738196 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.749764 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.750402 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.250364526 +0000 UTC m=+143.782332015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.830769 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf"] Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.851226 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.851632 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.351620298 +0000 UTC m=+143.883587787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: W1009 19:30:57.883912 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23113625_7519_44a3_b330_50463a4800c4.slice/crio-c83cb502499a20ad8d440ca494bb326ee70a4d8984ec40c2b2b704272fd332c6 WatchSource:0}: Error finding container c83cb502499a20ad8d440ca494bb326ee70a4d8984ec40c2b2b704272fd332c6: Status 404 returned error can't find the container with id c83cb502499a20ad8d440ca494bb326ee70a4d8984ec40c2b2b704272fd332c6 Oct 09 19:30:57 crc kubenswrapper[4907]: W1009 19:30:57.944828 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65313d6f_23ee_4269_ad8c_140fb200c3e5.slice/crio-d95aa5087e5890e84a524f286b89c9beffcdfaa267977044efc5307ef0a89b2a WatchSource:0}: Error finding container d95aa5087e5890e84a524f286b89c9beffcdfaa267977044efc5307ef0a89b2a: Status 404 returned error can't find the container with id d95aa5087e5890e84a524f286b89c9beffcdfaa267977044efc5307ef0a89b2a Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.958137 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:57 crc kubenswrapper[4907]: E1009 19:30:57.958927 4907 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.458907667 +0000 UTC m=+143.990875156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:57 crc kubenswrapper[4907]: I1009 19:30:57.981986 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nl9hz"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.062359 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.062869 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.562853359 +0000 UTC m=+144.094820848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.133058 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.166572 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.167224 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.667197792 +0000 UTC m=+144.199165281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.200838 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-nnhgl"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.239787 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.269337 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.269828 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.769813291 +0000 UTC m=+144.301780770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.289247 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.291434 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wgct9"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.298228 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm" event={"ID":"92c30e77-e0cc-4e3a-a38e-0856daffccd2","Type":"ContainerStarted","Data":"48f78983419ece540e24c7829815e9f99638a075d76f1b70c679424fdf7f8f26"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.312073 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.325789 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" event={"ID":"c153b2ce-efcd-4d73-8b4f-8e67322e88c5","Type":"ContainerStarted","Data":"2ed8693a06b3f3c196d0455bc44d9ac55aba3bad1ebd34dfbd4b688b72b64286"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.336119 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" 
event={"ID":"92eb9688-52c0-4ba4-8a82-3f874d85e2cf","Type":"ContainerStarted","Data":"85fec867537a95af74558ebc1290a36e0144a690fc55507e548dc609f10b9b5f"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.337113 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" event={"ID":"38670a3a-3b58-403a-92e0-f14d5dda51f3","Type":"ContainerStarted","Data":"9bdc38ee1078465542790587ad52ca005d34d6f99401790d952f7f3fe6dbcdb9"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.338525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" event={"ID":"471c7823-e3ea-4b73-9034-3ba5fc123190","Type":"ContainerStarted","Data":"f05530608795fa45bedfd447839e668125a0cecada4b8219493f80f7a64a6d11"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.346491 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" event={"ID":"04db531f-adc2-4158-8d64-109774b8115e","Type":"ContainerStarted","Data":"e9d6bc3041691fe21059fa19cbb00a771767da370752afdea7e3d8d240341322"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.365544 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" event={"ID":"e91d8cef-721a-42b2-a7e7-17d555f9943a","Type":"ContainerStarted","Data":"1a5c42f2759d536384b2a5e0e7c4a7e985c1d0f58c611f3e9b967ce9d3d22d28"} Oct 09 19:30:58 crc kubenswrapper[4907]: W1009 19:30:58.369220 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f00b5d_8d8f_4246_a5c2_da2181451a39.slice/crio-c25ef106e47807c228b7590896d5f1028b7b0978f7de8639279b79b6fccf84b9 WatchSource:0}: Error finding container c25ef106e47807c228b7590896d5f1028b7b0978f7de8639279b79b6fccf84b9: Status 404 returned error can't find the container 
with id c25ef106e47807c228b7590896d5f1028b7b0978f7de8639279b79b6fccf84b9 Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.370566 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.371142 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.871118596 +0000 UTC m=+144.403086085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: W1009 19:30:58.382818 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d687ee0_5957_4927_98ae_9a7ecdcd29c7.slice/crio-225dbfada0c28d76c55f2cf71e60bc91df250393039083635faddfcf7cb11a32 WatchSource:0}: Error finding container 225dbfada0c28d76c55f2cf71e60bc91df250393039083635faddfcf7cb11a32: Status 404 returned error can't find the container with id 225dbfada0c28d76c55f2cf71e60bc91df250393039083635faddfcf7cb11a32 Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.383734 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" event={"ID":"099dbc78-3133-444f-b40a-b931b090a2d9","Type":"ContainerStarted","Data":"dc501ee110249705e4cb1f1b9caec603dbd6b678c5bac1e6f583336a6095f29c"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.428806 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" event={"ID":"9f6cd201-6a64-4058-9e26-946d60f89c38","Type":"ContainerStarted","Data":"66ac209d8e7731bf257d61ce85e48e26fec50754186293ce13dc4a8a3224b718"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.430675 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" event={"ID":"4150c40f-0b19-4f81-b11c-6b19b25922b1","Type":"ContainerStarted","Data":"e8b1e1514cc6f4012c0c09bb5ab841c4b9c30596d7fec1acda1442156ec2dc56"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.437497 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" event={"ID":"32375b53-7eef-44a8-96f8-e422ff17dd63","Type":"ContainerStarted","Data":"d1ec791c2950d11b3aee821f82acc00d5c3da3c1338af1926ef14d2f5db5936a"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.443131 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" event={"ID":"b7a19a90-b711-4db3-9d80-a111398d62d0","Type":"ContainerStarted","Data":"e9c51f0bcf34cb32a15c1dbefd37c284936be7112fcc3ff344aec4e5cb2fd4ad"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.457658 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-754vb" Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.458071 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:30:58 crc 
kubenswrapper[4907]: I1009 19:30:58.459449 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-754vb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.459549 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-754vb" podUID="fc36d689-70da-40c4-93ae-f5e35e414999" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.480913 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" event={"ID":"2dea93b7-2762-4b9c-ac3b-c7980070faf1","Type":"ContainerStarted","Data":"24190c994fd173b22e91a86cdf5c41ec68162cd3e76e5e9425f03bc2638ff7a5"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.484289 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.485948 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:58.985928687 +0000 UTC m=+144.517896176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.490899 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" event={"ID":"532e3657-8f58-4b6b-8b4d-6d0d6a49d451","Type":"ContainerStarted","Data":"97bcd632f4c1d623ea71722f447e98e72103b8a40e3793e0ca296dde429f95ea"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.504486 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" event={"ID":"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b","Type":"ContainerStarted","Data":"220aa80aecc4c3a075dadcd849e55f2eb813d3543010c2b1ba1a2d2792daf549"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.522637 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h9vb8"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.538556 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.544072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" event={"ID":"65313d6f-23ee-4269-ad8c-140fb200c3e5","Type":"ContainerStarted","Data":"d95aa5087e5890e84a524f286b89c9beffcdfaa267977044efc5307ef0a89b2a"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.581404 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-2hbf2" event={"ID":"1accac4f-68a4-4bf9-92a6-06d8c1c36db9","Type":"ContainerStarted","Data":"3d06a037da7e844ef90ad5899ed9ab0ade0e7fe44c568a897f990662b5f69e2a"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.584839 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.585870 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.085841817 +0000 UTC m=+144.617809306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.586871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" event={"ID":"a7cd2efb-c621-4ea7-b1d2-ea923968f737","Type":"ContainerStarted","Data":"b45cba4f3bb879e04acef598d9f4b0b9c962dcdf2d0140355cde24a7b6514093"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.590348 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" 
event={"ID":"568e82c4-d3cb-4501-a7f6-2343d01f0d60","Type":"ContainerStarted","Data":"a9a6e09b83967dc536ab5e16687493a645afee31f18a32aa414ba2e4d81f7746"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.619361 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zc7nw" event={"ID":"581fdc11-a986-47b3-9a77-7e9ebe63b95d","Type":"ContainerStarted","Data":"ec842f99ebcc3446df948cbafc59661ef8e63bd31b887ebd6a0d55d2cb722be4"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.655694 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" event={"ID":"02b8e549-abd9-4adb-a77a-f2af6305625a","Type":"ContainerStarted","Data":"d73e154a10164d4d80cd544ff80ea85ed29d1bd294165feac1b6c02bf6064024"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.655782 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" event={"ID":"86eff4e6-938a-48fa-a116-c46597bc0868","Type":"ContainerStarted","Data":"ecc416dcb1792cb9d8173232b33a63f5671bfd94f2df0f527d6ddd225f51c7f6"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.674381 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" podStartSLOduration=123.674351234 podStartE2EDuration="2m3.674351234s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:58.669994032 +0000 UTC m=+144.201961531" watchObservedRunningTime="2025-10-09 19:30:58.674351234 +0000 UTC m=+144.206318723" Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.678833 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" 
event={"ID":"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9","Type":"ContainerStarted","Data":"cadc6f80a445d7fb6e3141c15e6465d2392b19ec2c0b39023e2f24c5bf4913ea"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.681525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" event={"ID":"23113625-7519-44a3-b330-50463a4800c4","Type":"ContainerStarted","Data":"c83cb502499a20ad8d440ca494bb326ee70a4d8984ec40c2b2b704272fd332c6"} Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.690272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.692798 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.192745855 +0000 UTC m=+144.724713344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.695953 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.697453 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.735021 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qs6lp" Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.745594 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-g48tm"] Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.793900 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.795928 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 19:30:59.295910098 +0000 UTC m=+144.827877587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.898567 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:58 crc kubenswrapper[4907]: E1009 19:30:58.898939 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.398927557 +0000 UTC m=+144.930895046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.929563 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2fnwq" podStartSLOduration=124.929543691 podStartE2EDuration="2m4.929543691s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:58.926226106 +0000 UTC m=+144.458193605" watchObservedRunningTime="2025-10-09 19:30:58.929543691 +0000 UTC m=+144.461511180" Oct 09 19:30:58 crc kubenswrapper[4907]: I1009 19:30:58.954795 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qs6lp" podStartSLOduration=124.954774697 podStartE2EDuration="2m4.954774697s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:58.953045973 +0000 UTC m=+144.485013472" watchObservedRunningTime="2025-10-09 19:30:58.954774697 +0000 UTC m=+144.486742176" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.000122 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.000536 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.500518149 +0000 UTC m=+145.032485648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.014746 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hhfns" podStartSLOduration=124.014727423 podStartE2EDuration="2m4.014727423s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.011293065 +0000 UTC m=+144.543260554" watchObservedRunningTime="2025-10-09 19:30:59.014727423 +0000 UTC m=+144.546694902" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.104623 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 
19:30:59.105382 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.605367675 +0000 UTC m=+145.137335154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.121364 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bsxgx" podStartSLOduration=124.121345904 podStartE2EDuration="2m4.121345904s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.074441763 +0000 UTC m=+144.606409242" watchObservedRunningTime="2025-10-09 19:30:59.121345904 +0000 UTC m=+144.653313393" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.171042 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" podStartSLOduration=124.171020607 podStartE2EDuration="2m4.171020607s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.164998863 +0000 UTC m=+144.696966362" watchObservedRunningTime="2025-10-09 19:30:59.171020607 +0000 UTC m=+144.702988096" Oct 09 
19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.204782 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-wz5xt" podStartSLOduration=124.204761501 podStartE2EDuration="2m4.204761501s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.201945579 +0000 UTC m=+144.733913078" watchObservedRunningTime="2025-10-09 19:30:59.204761501 +0000 UTC m=+144.736728990" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.211685 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.212236 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.712209902 +0000 UTC m=+145.244177391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.212310 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.212825 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.712817448 +0000 UTC m=+145.244784937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.322494 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.323168 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.823150484 +0000 UTC m=+145.355117973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.432978 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.433880 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:30:59.93385647 +0000 UTC m=+145.465823959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.458574 4907 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6vslh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.458635 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" podUID="9f6cd201-6a64-4058-9e26-946d60f89c38" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.465640 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqls2" podStartSLOduration=125.465628004 podStartE2EDuration="2m5.465628004s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.463986552 +0000 UTC m=+144.995954061" watchObservedRunningTime="2025-10-09 19:30:59.465628004 +0000 UTC m=+144.997595493" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.539456 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.539982 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.039960878 +0000 UTC m=+145.571928367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.617582 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" podStartSLOduration=59.617548225 podStartE2EDuration="59.617548225s" podCreationTimestamp="2025-10-09 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.615349959 +0000 UTC m=+145.147317448" watchObservedRunningTime="2025-10-09 19:30:59.617548225 +0000 UTC m=+145.149515714" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.644223 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.644932 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.144912706 +0000 UTC m=+145.676880195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.678812 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-754vb" podStartSLOduration=124.678793224 podStartE2EDuration="2m4.678793224s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.672649427 +0000 UTC m=+145.204616916" watchObservedRunningTime="2025-10-09 19:30:59.678793224 +0000 UTC m=+145.210760713" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.718027 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" event={"ID":"5ecfaee0-5af7-4d8e-8ae1-fb923602c05b","Type":"ContainerStarted","Data":"146a6e6c3c0d631d6d618253dd58e43e23b8b1bdc59248d46f89b4a9b6fc36fc"} Oct 
09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.731305 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" podStartSLOduration=124.731289239 podStartE2EDuration="2m4.731289239s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.728933459 +0000 UTC m=+145.260900948" watchObservedRunningTime="2025-10-09 19:30:59.731289239 +0000 UTC m=+145.263256728" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.747983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" event={"ID":"56197cfc-90ab-486e-8812-3ac1d97b9f2a","Type":"ContainerStarted","Data":"93ee4f2cc01d5600ba96bf75c99b6d9bb6547e51d63d3e3729e539af129f4ace"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.750083 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.750541 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.250525212 +0000 UTC m=+145.782492701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.776848 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" podStartSLOduration=124.776826535 podStartE2EDuration="2m4.776826535s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.771173131 +0000 UTC m=+145.303140630" watchObservedRunningTime="2025-10-09 19:30:59.776826535 +0000 UTC m=+145.308794024" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.818120 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" event={"ID":"32375b53-7eef-44a8-96f8-e422ff17dd63","Type":"ContainerStarted","Data":"644ddfb257fac6b547a56f654a56fa37314c31f83f4a8b4279535915b63d2921"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.844685 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" event={"ID":"92eb9688-52c0-4ba4-8a82-3f874d85e2cf","Type":"ContainerStarted","Data":"752d4c754c30fbb378d359ec3690e73283c5e0aec792559042ad2d540303decf"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.845500 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.851134 4907 
patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6rv64 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.851184 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" podUID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.852240 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.853531 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.35351806 +0000 UTC m=+145.885485549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.872377 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2hbf2" podStartSLOduration=6.872358513 podStartE2EDuration="6.872358513s" podCreationTimestamp="2025-10-09 19:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.821164201 +0000 UTC m=+145.353131690" watchObservedRunningTime="2025-10-09 19:30:59.872358513 +0000 UTC m=+145.404326002" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.873656 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q95gm" event={"ID":"02b8e549-abd9-4adb-a77a-f2af6305625a","Type":"ContainerStarted","Data":"a99538ab45b0a6969270ebd2452b4a7c11518e9dbe5b2cc9c9d134ee447cc5f2"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.874801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" event={"ID":"2d687ee0-5957-4927-98ae-9a7ecdcd29c7","Type":"ContainerStarted","Data":"225dbfada0c28d76c55f2cf71e60bc91df250393039083635faddfcf7cb11a32"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.876019 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" 
event={"ID":"c514b0d4-4b2a-44a4-8610-20d2cd4ecaf9","Type":"ContainerStarted","Data":"f42649600f2c8794028a7831c2b39decd1d890811a7fa0b6df9823710ccf5192"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.892301 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" event={"ID":"e4b6fb62-3f71-480f-b283-3da1fe2b63b5","Type":"ContainerStarted","Data":"fb168eef82e981551dea4b09dcb40a5647c91d117c6a74f7205216c303672d7c"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.895497 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" event={"ID":"aa4f766f-2388-46bb-8738-e09b42f189c6","Type":"ContainerStarted","Data":"1faa86a57ee99c25c371c25340cd73dd190ce1b6e768d831a00b69da54d58e94"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.896804 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" event={"ID":"f7f00b5d-8d8f-4246-a5c2-da2181451a39","Type":"ContainerStarted","Data":"c25ef106e47807c228b7590896d5f1028b7b0978f7de8639279b79b6fccf84b9"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.900221 4907 generic.go:334] "Generic (PLEG): container finished" podID="04db531f-adc2-4158-8d64-109774b8115e" containerID="7681e4bbb30cf3b3759815059e21e8fd42a640d74436824ab07d2b0d693e54c2" exitCode=0 Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.900332 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" event={"ID":"04db531f-adc2-4158-8d64-109774b8115e","Type":"ContainerDied","Data":"7681e4bbb30cf3b3759815059e21e8fd42a640d74436824ab07d2b0d693e54c2"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.910974 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" 
event={"ID":"697f3940-9b2a-4fbc-9e99-933a3645bfc3","Type":"ContainerStarted","Data":"dc3de556f9dab83873ea70366f01a7307a9555e1af279ff94fa92fb013e70039"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.911476 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" event={"ID":"697f3940-9b2a-4fbc-9e99-933a3645bfc3","Type":"ContainerStarted","Data":"03bfb8960d3b5e0454404ffe0a7842242d6c0b47ccf7d4a7ce8ccf321747352f"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.917250 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-58hw7" podStartSLOduration=124.917229332 podStartE2EDuration="2m4.917229332s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.916936055 +0000 UTC m=+145.448903544" watchObservedRunningTime="2025-10-09 19:30:59.917229332 +0000 UTC m=+145.449196831" Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.955165 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:30:59 crc kubenswrapper[4907]: E1009 19:30:59.956544 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.456530409 +0000 UTC m=+145.988497888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.972633 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" event={"ID":"38670a3a-3b58-403a-92e0-f14d5dda51f3","Type":"ContainerStarted","Data":"4c57a6ad06df6e5b534d847cb1b916b83ddda9495cbfa10e8494e335e267b016"} Oct 09 19:30:59 crc kubenswrapper[4907]: I1009 19:30:59.993678 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" event={"ID":"e91d8cef-721a-42b2-a7e7-17d555f9943a","Type":"ContainerStarted","Data":"cc6016ef6cc5c95f1aca5dba200d51310ae1db2e4d7fde9c8bdb1dbd7623ee7b"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.007878 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" podStartSLOduration=125.007850164 podStartE2EDuration="2m5.007850164s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:30:59.951665624 +0000 UTC m=+145.483633123" watchObservedRunningTime="2025-10-09 19:31:00.007850164 +0000 UTC m=+145.539817653" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.041666 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ntw5v" podStartSLOduration=125.041641489 
podStartE2EDuration="2m5.041641489s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.040254394 +0000 UTC m=+145.572221893" watchObservedRunningTime="2025-10-09 19:31:00.041641489 +0000 UTC m=+145.573608978" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.056875 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.059064 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.559046425 +0000 UTC m=+146.091013914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.071844 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" event={"ID":"65313d6f-23ee-4269-ad8c-140fb200c3e5","Type":"ContainerStarted","Data":"fc4878719368c7642a684a882d60430af735763ccffc25a466d27ae17994fc3b"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.073112 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.091807 4907 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ngfwf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.092416 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" podUID="65313d6f-23ee-4269-ad8c-140fb200c3e5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.092885 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" event={"ID":"23e92368-60d2-4a4c-b8ba-0c2464bb99e7","Type":"ContainerStarted","Data":"372dd8f9b01f7ac430a90f28ade7f79cd3d8ad74d0e9f0e42dc1f7756223cdd5"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.127502 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" event={"ID":"c6107861-4366-4081-8f67-3f23d66590f2","Type":"ContainerStarted","Data":"aca4b5002ed13623130dbddd98768ce7f9fec47077c87e9d41833382d1116e51"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.139680 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mzwdh" event={"ID":"4ef0fdd5-8e73-4320-8021-e6f28b26f248","Type":"ContainerStarted","Data":"6ee27841df5b64a1be4a3ddbddb917919f9d3b69a5aaf8c8c52ded13e84868a7"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.139753 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mzwdh" event={"ID":"4ef0fdd5-8e73-4320-8021-e6f28b26f248","Type":"ContainerStarted","Data":"e913fb97331eac6672cf7ef65b69b2899245d9ba15879d42d5b4e36d15503c6c"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.161977 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.163392 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 19:31:00.663376198 +0000 UTC m=+146.195343687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.178856 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9bmr8" podStartSLOduration=125.178830523 podStartE2EDuration="2m5.178830523s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.127799916 +0000 UTC m=+145.659767405" watchObservedRunningTime="2025-10-09 19:31:00.178830523 +0000 UTC m=+145.710798012" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.180105 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm" event={"ID":"92c30e77-e0cc-4e3a-a38e-0856daffccd2","Type":"ContainerStarted","Data":"cbcac40e1d996105717f5c393b9e87876e9a49b0529afdf1f4cb6874cb032f8b"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.224126 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" podStartSLOduration=125.224100323 podStartE2EDuration="2m5.224100323s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 19:31:00.183571085 +0000 UTC m=+145.715538584" watchObservedRunningTime="2025-10-09 19:31:00.224100323 +0000 UTC m=+145.756067812" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.236988 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" event={"ID":"23113625-7519-44a3-b330-50463a4800c4","Type":"ContainerStarted","Data":"81310c1c77c217bfdcd1c2db91103c4d321cabd7a7a8fb49c1960c3e37434ce1"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.284460 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.288513 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.788490802 +0000 UTC m=+146.320458291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.307954 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zp9bm" podStartSLOduration=125.30791706 podStartE2EDuration="2m5.30791706s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.227362017 +0000 UTC m=+145.759329506" watchObservedRunningTime="2025-10-09 19:31:00.30791706 +0000 UTC m=+145.839884549" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.308286 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mzwdh" podStartSLOduration=125.308274669 podStartE2EDuration="2m5.308274669s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.291356086 +0000 UTC m=+145.823323585" watchObservedRunningTime="2025-10-09 19:31:00.308274669 +0000 UTC m=+145.840242158" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.314814 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9vb8" event={"ID":"f864bfbb-0015-4f97-9ebc-3775f5694fdb","Type":"ContainerStarted","Data":"c44bcb227caaac91c176cabf901701712c7812a285d9955945d032d3618c8f2f"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 
19:31:00.365578 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5l6b5" podStartSLOduration=125.365551037 podStartE2EDuration="2m5.365551037s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.34382428 +0000 UTC m=+145.875791779" watchObservedRunningTime="2025-10-09 19:31:00.365551037 +0000 UTC m=+145.897518526" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.367273 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" event={"ID":"568e82c4-d3cb-4501-a7f6-2343d01f0d60","Type":"ContainerStarted","Data":"bbfd2259e8dc130bfabd81bfedb598ba6d8f1d24deab4db55cf21262ebb8be82"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.367356 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" event={"ID":"568e82c4-d3cb-4501-a7f6-2343d01f0d60","Type":"ContainerStarted","Data":"772881c4114ae75a1b77ee5570e846edfc8fc62b3f6be93908470881c2f5f938"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.367778 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.398573 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.400021 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:00.900001049 +0000 UTC m=+146.431968538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.430184 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zc7nw" event={"ID":"581fdc11-a986-47b3-9a77-7e9ebe63b95d","Type":"ContainerStarted","Data":"4b8f7053e0b40235065099cff3d63b4c5183ba4f1209dde90c7a8da9623859a9"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.435061 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" event={"ID":"099dbc78-3133-444f-b40a-b931b090a2d9","Type":"ContainerStarted","Data":"8cf7b49096b50815465430a2c21803f4866da2a70049a0dcf0d92867c6385392"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.466585 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" podStartSLOduration=125.466562534 podStartE2EDuration="2m5.466562534s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.46092402 +0000 UTC m=+145.992891509" watchObservedRunningTime="2025-10-09 
19:31:00.466562534 +0000 UTC m=+145.998530023" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.471871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-754vb" event={"ID":"fc36d689-70da-40c4-93ae-f5e35e414999","Type":"ContainerStarted","Data":"7f77bc2f9b939978098b267469e2929af8b59334222095f7bdf21688c680c788"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.474794 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-754vb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.474948 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-754vb" podUID="fc36d689-70da-40c4-93ae-f5e35e414999" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.496086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" event={"ID":"4150c40f-0b19-4f81-b11c-6b19b25922b1","Type":"ContainerStarted","Data":"241c463fd0f436f8271673f5d7599cb1d9397ae3d8b0f95e64ec532a7e0e4308"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.499959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.500392 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.00037856 +0000 UTC m=+146.532346049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.507328 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" event={"ID":"2dea93b7-2762-4b9c-ac3b-c7980070faf1","Type":"ContainerStarted","Data":"985197980b69460bd68ed0fca2cff5112ee742cce16f098282269c10a1e82ba8"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.524726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" event={"ID":"c153b2ce-efcd-4d73-8b4f-8e67322e88c5","Type":"ContainerStarted","Data":"ca7879c2b6a286bb69f313f0476f97ec62059e250cc32609ce16a08537d96fbf"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.526100 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.542773 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" event={"ID":"fd3641da-8c41-4082-8d46-a2590ad60dbd","Type":"ContainerStarted","Data":"f068d90b72d996699c6ee1714fbc12d82f0e51b0a88d4ef6a9d1fdd7990c1b74"} Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.546974 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.569653 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6vslh" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.576955 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8rxx" podStartSLOduration=125.576926051 podStartE2EDuration="2m5.576926051s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.576617583 +0000 UTC m=+146.108585082" watchObservedRunningTime="2025-10-09 19:31:00.576926051 +0000 UTC m=+146.108893550" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.577294 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zc7nw" podStartSLOduration=7.57728641 podStartE2EDuration="7.57728641s" podCreationTimestamp="2025-10-09 19:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.515164449 +0000 UTC m=+146.047131938" watchObservedRunningTime="2025-10-09 19:31:00.57728641 +0000 UTC m=+146.109253909" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.600570 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 
19:31:00.602194 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.102174408 +0000 UTC m=+146.634141897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.713341 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.716895 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.216875506 +0000 UTC m=+146.748843225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.734285 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.752239 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:00 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:00 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:00 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.752324 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.754086 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nxgdx" podStartSLOduration=125.754073599 podStartE2EDuration="2m5.754073599s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.747785658 +0000 UTC m=+146.279753147" 
watchObservedRunningTime="2025-10-09 19:31:00.754073599 +0000 UTC m=+146.286041088" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.815800 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.815913 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.315893083 +0000 UTC m=+146.847860572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.816204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.816609 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.316599961 +0000 UTC m=+146.848567440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.833066 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" podStartSLOduration=125.833047362 podStartE2EDuration="2m5.833047362s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:00.831109073 +0000 UTC m=+146.363076562" watchObservedRunningTime="2025-10-09 19:31:00.833047362 +0000 UTC m=+146.365014851" Oct 09 19:31:00 crc kubenswrapper[4907]: I1009 19:31:00.919096 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:00 crc kubenswrapper[4907]: E1009 19:31:00.919609 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.419590549 +0000 UTC m=+146.951558048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.024254 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.025052 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.52502437 +0000 UTC m=+147.056992049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.126259 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.126906 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.626891439 +0000 UTC m=+147.158858928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.228258 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.228743 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.728725018 +0000 UTC m=+147.260692507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.237086 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7t9j5"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.238300 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.244326 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.303020 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7t9j5"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.332822 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.333429 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2b7k\" (UniqueName: \"kubernetes.io/projected/84707a79-5b88-454b-9e1f-5618515a5623-kube-api-access-d2b7k\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " 
pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.335561 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-catalog-content\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.335615 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-utilities\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.335834 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.835806131 +0000 UTC m=+147.367773610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.397863 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b5b4h"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.398973 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.406761 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.422562 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5b4h"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.439356 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2b7k\" (UniqueName: \"kubernetes.io/projected/84707a79-5b88-454b-9e1f-5618515a5623-kube-api-access-d2b7k\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.439444 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-catalog-content\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 
crc kubenswrapper[4907]: I1009 19:31:01.439490 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-utilities\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.439519 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.440076 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:01.940056101 +0000 UTC m=+147.472023590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.441166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-catalog-content\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.441412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-utilities\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.501324 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2b7k\" (UniqueName: \"kubernetes.io/projected/84707a79-5b88-454b-9e1f-5618515a5623-kube-api-access-d2b7k\") pod \"community-operators-7t9j5\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.550456 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.550684 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcxwm\" (UniqueName: \"kubernetes.io/projected/bf99f768-d09e-4105-9150-39b510795216-kube-api-access-dcxwm\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.550914 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-utilities\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.550955 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-catalog-content\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.551077 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.051060944 +0000 UTC m=+147.583028433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.586925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" event={"ID":"c6107861-4366-4081-8f67-3f23d66590f2","Type":"ContainerStarted","Data":"d2b7b9c6626b488cde7ee5905031e5f98828d481b22b8d5ead0b2e8b4b55dea4"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.587057 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.608604 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vchbq"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.610150 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.612261 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" event={"ID":"b7a19a90-b711-4db3-9d80-a111398d62d0","Type":"ContainerStarted","Data":"06d7f6fc49d6d30c6e697643287886284b3e0edd1fbcb200bdb39f8224497370"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.627021 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bz764" podStartSLOduration=126.626994659 podStartE2EDuration="2m6.626994659s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:01.624655849 +0000 UTC m=+147.156623348" watchObservedRunningTime="2025-10-09 19:31:01.626994659 +0000 UTC m=+147.158962148" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.632290 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" event={"ID":"697f3940-9b2a-4fbc-9e99-933a3645bfc3","Type":"ContainerStarted","Data":"ad9d0b690177fc17e92e759f4144389c6440950b278a9a65645697fe157d3105"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.646249 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vchbq"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.646951 4907 generic.go:334] "Generic (PLEG): container finished" podID="e4b6fb62-3f71-480f-b283-3da1fe2b63b5" containerID="23d59603065dbebb64ea9a848744f9f04c1385919740bcaea9a199ddb7a626a0" exitCode=0 Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.648681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" 
event={"ID":"e4b6fb62-3f71-480f-b283-3da1fe2b63b5","Type":"ContainerDied","Data":"23d59603065dbebb64ea9a848744f9f04c1385919740bcaea9a199ddb7a626a0"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.652997 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.653089 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-utilities\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.653127 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-catalog-content\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.653163 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcxwm\" (UniqueName: \"kubernetes.io/projected/bf99f768-d09e-4105-9150-39b510795216-kube-api-access-dcxwm\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.654254 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.154238327 +0000 UTC m=+147.686205816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.658873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-utilities\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.660660 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-catalog-content\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.679937 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" event={"ID":"f7f00b5d-8d8f-4246-a5c2-da2181451a39","Type":"ContainerStarted","Data":"eb7d825e22cd1f3508295ea4441ec5687076d3195d5f1b7e9ed279478d04f41a"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.730059 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcxwm\" 
(UniqueName: \"kubernetes.io/projected/bf99f768-d09e-4105-9150-39b510795216-kube-api-access-dcxwm\") pod \"certified-operators-b5b4h\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.742508 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:01 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:01 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:01 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.742561 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.749628 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nl9hz" podStartSLOduration=126.74959974 podStartE2EDuration="2m6.74959974s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:01.744218582 +0000 UTC m=+147.276186081" watchObservedRunningTime="2025-10-09 19:31:01.74959974 +0000 UTC m=+147.281567229" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.759925 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.760223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-utilities\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.760300 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbq6\" (UniqueName: \"kubernetes.io/projected/61589af1-8a53-445d-afef-ff35192b01a5-kube-api-access-gcbq6\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.760408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-catalog-content\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.761391 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.261377152 +0000 UTC m=+147.793344641 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.777197 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" event={"ID":"04db531f-adc2-4158-8d64-109774b8115e","Type":"ContainerStarted","Data":"d5e9d0aab4c0040b8c2f612ffa5570c0c07ce8d50c01b2e8c2dfaf66e5d3d635"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.778395 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.789520 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.812285 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fn8wz" podStartSLOduration=127.812267055 podStartE2EDuration="2m7.812267055s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:01.809139375 +0000 UTC m=+147.341106864" watchObservedRunningTime="2025-10-09 19:31:01.812267055 +0000 UTC m=+147.344234544" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.842289 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-27prd"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.844329 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" event={"ID":"56197cfc-90ab-486e-8812-3ac1d97b9f2a","Type":"ContainerStarted","Data":"23b8f7efbe5737adbbd3cf124783c4051e8e438e2d67bc5eb493c8ae9edd7402"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.844507 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.851751 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27prd"] Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.865302 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-utilities\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.865410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-utilities\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.865581 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbq6\" (UniqueName: \"kubernetes.io/projected/61589af1-8a53-445d-afef-ff35192b01a5-kube-api-access-gcbq6\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.865638 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-catalog-content\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.865681 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-catalog-content\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.865786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72p5\" (UniqueName: \"kubernetes.io/projected/cec75db6-ed33-4e33-b35c-44a59a054859-kube-api-access-b72p5\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.865829 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.866062 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-utilities\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.866279 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.366259438 +0000 UTC m=+147.898227107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.866810 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-catalog-content\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.882607 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" event={"ID":"099dbc78-3133-444f-b40a-b931b090a2d9","Type":"ContainerStarted","Data":"5492aaff3a4270f2067589263a2d6038db8cbf599beba4cfaf316f7509c8ff99"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.929803 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" event={"ID":"38670a3a-3b58-403a-92e0-f14d5dda51f3","Type":"ContainerStarted","Data":"1800cc01b438e8a2e10be17e7494b6a1a4e1f70c4b3e2975c24ace84c9e531db"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.931891 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbq6\" (UniqueName: \"kubernetes.io/projected/61589af1-8a53-445d-afef-ff35192b01a5-kube-api-access-gcbq6\") pod \"community-operators-vchbq\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.935907 
4907 generic.go:334] "Generic (PLEG): container finished" podID="2d687ee0-5957-4927-98ae-9a7ecdcd29c7" containerID="adce8d06e990079f82b8dcbf70e0023984e31b7de43676946d7520f6f7688892" exitCode=0 Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.936128 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" event={"ID":"2d687ee0-5957-4927-98ae-9a7ecdcd29c7","Type":"ContainerDied","Data":"adce8d06e990079f82b8dcbf70e0023984e31b7de43676946d7520f6f7688892"} Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.979405 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wggqg" podStartSLOduration=126.979380556 podStartE2EDuration="2m6.979380556s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:01.905383581 +0000 UTC m=+147.437351070" watchObservedRunningTime="2025-10-09 19:31:01.979380556 +0000 UTC m=+147.511348045" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.980924 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" podStartSLOduration=127.980918356 podStartE2EDuration="2m7.980918356s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:01.952352664 +0000 UTC m=+147.484320173" watchObservedRunningTime="2025-10-09 19:31:01.980918356 +0000 UTC m=+147.512885845" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.982711 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.983534 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.983824 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.48381229 +0000 UTC m=+148.015779779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.983892 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b72p5\" (UniqueName: \"kubernetes.io/projected/cec75db6-ed33-4e33-b35c-44a59a054859-kube-api-access-b72p5\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.983924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.983946 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-utilities\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.984086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-catalog-content\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: E1009 19:31:01.985842 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.485833821 +0000 UTC m=+148.017801310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.986737 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-utilities\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:01 crc kubenswrapper[4907]: I1009 19:31:01.987768 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-catalog-content\") pod \"certified-operators-27prd\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.044925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9vb8" event={"ID":"f864bfbb-0015-4f97-9ebc-3775f5694fdb","Type":"ContainerStarted","Data":"a69bef0ae65ecc76250f7b85ed7c105dbf6bea5be9ab1c9e5d33c85cf6e20bb3"} Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.054316 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-h9vb8" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.060036 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72p5\" (UniqueName: \"kubernetes.io/projected/cec75db6-ed33-4e33-b35c-44a59a054859-kube-api-access-b72p5\") pod \"certified-operators-27prd\" 
(UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.072294 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-s9h47" podStartSLOduration=127.072270096 podStartE2EDuration="2m7.072270096s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:02.070312966 +0000 UTC m=+147.602280455" watchObservedRunningTime="2025-10-09 19:31:02.072270096 +0000 UTC m=+147.604237575" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.086530 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.090301 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.590272527 +0000 UTC m=+148.122240016 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.095825 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.096507 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.596489736 +0000 UTC m=+148.128457225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.138489 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" event={"ID":"aa4f766f-2388-46bb-8738-e09b42f189c6","Type":"ContainerStarted","Data":"ba6d3001e980e7bea50e23626afb97f061adefcabe767257d5cac89c0849cf8b"} Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.141249 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.204049 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.205852 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.206876 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.207551 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.707532941 +0000 UTC m=+148.239500430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.247150 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" event={"ID":"32375b53-7eef-44a8-96f8-e422ff17dd63","Type":"ContainerStarted","Data":"6e629f34e2e4938e2231b40391e898260e999ac2ed27859141cee06c71aaa5af"} Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.324907 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.325331 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.825315398 +0000 UTC m=+148.357282887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.343091 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" event={"ID":"23e92368-60d2-4a4c-b8ba-0c2464bb99e7","Type":"ContainerStarted","Data":"fb4f3a818d5135acb46984cb36afb2a18e25569d06150389e083e0203390b65b"} Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.343295 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" event={"ID":"23e92368-60d2-4a4c-b8ba-0c2464bb99e7","Type":"ContainerStarted","Data":"9b34fb7fa4ca013b9a50283c90aea2b38a51e28787aa27bb5533832705384d86"} Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.361145 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" event={"ID":"4150c40f-0b19-4f81-b11c-6b19b25922b1","Type":"ContainerStarted","Data":"a09c9b6430a687f71c851d9a14597b15d04cc03fe6b9c02a0dfd34111f243e48"} Oct 09 19:31:02 crc 
kubenswrapper[4907]: I1009 19:31:02.363541 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6rv64 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.363586 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" podUID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.364133 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-754vb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.364153 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-754vb" podUID="fc36d689-70da-40c4-93ae-f5e35e414999" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.393457 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.431571 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.432032 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:02.932010651 +0000 UTC m=+148.463978140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.498131 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h9vb8" podStartSLOduration=9.498107884 podStartE2EDuration="9.498107884s" podCreationTimestamp="2025-10-09 19:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:02.488038836 +0000 UTC m=+148.020006345" watchObservedRunningTime="2025-10-09 19:31:02.498107884 +0000 UTC m=+148.030075373" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.498559 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6tz8n" podStartSLOduration=127.498552616 podStartE2EDuration="2m7.498552616s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:02.387057599 +0000 UTC m=+147.919025088" 
watchObservedRunningTime="2025-10-09 19:31:02.498552616 +0000 UTC m=+148.030520105" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.533668 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.540105 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.040077329 +0000 UTC m=+148.572044818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.562213 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rrgs2" podStartSLOduration=127.562195686 podStartE2EDuration="2m7.562195686s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:02.555054933 +0000 UTC m=+148.087022442" watchObservedRunningTime="2025-10-09 19:31:02.562195686 +0000 UTC m=+148.094163175" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 
19:31:02.637876 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.638277 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.138250454 +0000 UTC m=+148.670217943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.638914 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.639214 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 19:31:03.139204969 +0000 UTC m=+148.671172458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.685732 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-99plh" podStartSLOduration=127.68571075 podStartE2EDuration="2m7.68571075s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:02.60373267 +0000 UTC m=+148.135700159" watchObservedRunningTime="2025-10-09 19:31:02.68571075 +0000 UTC m=+148.217678239" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.708146 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lqqp7" podStartSLOduration=127.708129154 podStartE2EDuration="2m7.708129154s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:02.707240531 +0000 UTC m=+148.239208030" watchObservedRunningTime="2025-10-09 19:31:02.708129154 +0000 UTC m=+148.240096643" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.725895 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7t9j5"] Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 
19:31:02.726352 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:02 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:02 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:02 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.726411 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.740381 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.740879 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.240859163 +0000 UTC m=+148.772826652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.773180 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5b4h"] Oct 09 19:31:02 crc kubenswrapper[4907]: W1009 19:31:02.830122 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf99f768_d09e_4105_9150_39b510795216.slice/crio-c2b477642716e6dfe4617b6bc52a5ecf2867e3f197f43b10b07dccf5a7a8669c WatchSource:0}: Error finding container c2b477642716e6dfe4617b6bc52a5ecf2867e3f197f43b10b07dccf5a7a8669c: Status 404 returned error can't find the container with id c2b477642716e6dfe4617b6bc52a5ecf2867e3f197f43b10b07dccf5a7a8669c Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.842821 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.843163 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.343151423 +0000 UTC m=+148.875118912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:02 crc kubenswrapper[4907]: I1009 19:31:02.944324 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:02 crc kubenswrapper[4907]: E1009 19:31:02.945201 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.445185557 +0000 UTC m=+148.977153046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.047119 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.069066 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.569032879 +0000 UTC m=+149.101000358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.151291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.151443 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.65141759 +0000 UTC m=+149.183385079 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.176624 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.177024 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.677010295 +0000 UTC m=+149.208977784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.285953 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.286302 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.786268144 +0000 UTC m=+149.318235643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.286749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.286856 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.286887 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.286915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.286956 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.287296 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.78727779 +0000 UTC m=+149.319245279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.289705 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.304794 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.325598 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.326702 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.333795 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-nnhgl" podStartSLOduration=128.333764801 podStartE2EDuration="2m8.333764801s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:02.867309852 +0000 UTC m=+148.399277351" watchObservedRunningTime="2025-10-09 19:31:03.333764801 +0000 UTC m=+148.865732290" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.334954 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-27prd"] Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.363839 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vchbq"] Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.384935 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.385021 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.388314 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.389004 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.888972855 +0000 UTC m=+149.420940344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.414885 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v9qxz"] Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.426101 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v9qxz"] Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.426668 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: W1009 19:31:03.431790 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61589af1_8a53_445d_afef_ff35192b01a5.slice/crio-448b2d7dce78ee41eec81d3d90e6a0d1a3a9a30b0dda22e564f2701aa064b337 WatchSource:0}: Error finding container 448b2d7dce78ee41eec81d3d90e6a0d1a3a9a30b0dda22e564f2701aa064b337: Status 404 returned error can't find the container with id 448b2d7dce78ee41eec81d3d90e6a0d1a3a9a30b0dda22e564f2701aa064b337 Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.435573 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h9vb8" event={"ID":"f864bfbb-0015-4f97-9ebc-3775f5694fdb","Type":"ContainerStarted","Data":"09373dfa5eb3d6351177050158a475dd505d275f56908c752eee8eb921271f6e"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.452590 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.490200 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-utilities\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.490239 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmskm\" (UniqueName: \"kubernetes.io/projected/a8f953fc-1eac-414d-b93e-e98eaa5aea79-kube-api-access-nmskm\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 
19:31:03.490262 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-catalog-content\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.490301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.491811 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:03.991779178 +0000 UTC m=+149.523746857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.492226 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" event={"ID":"2d687ee0-5957-4927-98ae-9a7ecdcd29c7","Type":"ContainerStarted","Data":"5578ab924a40d2a475a3aaffd9f6a916101b56e924831f188a41617d0b3647cb"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.500555 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27prd" event={"ID":"cec75db6-ed33-4e33-b35c-44a59a054859","Type":"ContainerStarted","Data":"43ee5baf42a0568e986898ba144b1414f9f955480e7d42b07277f7360f63fae6"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.514875 4907 generic.go:334] "Generic (PLEG): container finished" podID="bf99f768-d09e-4105-9150-39b510795216" containerID="892a482aec51cacc014a29347dcab41fe770812f819db9eda511d9a578a806be" exitCode=0 Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.514957 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5b4h" event={"ID":"bf99f768-d09e-4105-9150-39b510795216","Type":"ContainerDied","Data":"892a482aec51cacc014a29347dcab41fe770812f819db9eda511d9a578a806be"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.514990 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5b4h" 
event={"ID":"bf99f768-d09e-4105-9150-39b510795216","Type":"ContainerStarted","Data":"c2b477642716e6dfe4617b6bc52a5ecf2867e3f197f43b10b07dccf5a7a8669c"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.524271 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" podStartSLOduration=128.524065056 podStartE2EDuration="2m8.524065056s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:03.521737776 +0000 UTC m=+149.053705255" watchObservedRunningTime="2025-10-09 19:31:03.524065056 +0000 UTC m=+149.056032545" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.526515 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" event={"ID":"e4b6fb62-3f71-480f-b283-3da1fe2b63b5","Type":"ContainerStarted","Data":"2b49065578afdd9af7284ac535410d71ef5bbfab14d464512388440a94771efa"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.526883 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.530010 4907 generic.go:334] "Generic (PLEG): container finished" podID="84707a79-5b88-454b-9e1f-5618515a5623" containerID="f710d4e674080d1e903acd33ebb519996fe9e95b89ce3e7eaeb388be86f9b24b" exitCode=0 Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.532080 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7t9j5" event={"ID":"84707a79-5b88-454b-9e1f-5618515a5623","Type":"ContainerDied","Data":"f710d4e674080d1e903acd33ebb519996fe9e95b89ce3e7eaeb388be86f9b24b"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.532136 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7t9j5" 
event={"ID":"84707a79-5b88-454b-9e1f-5618515a5623","Type":"ContainerStarted","Data":"1baed4ec62510e201e98d4ff1db781e17eaeddbf66808e48c026d445c2c837b8"} Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.557694 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.576906 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.591201 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.591811 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-utilities\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.591862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmskm\" (UniqueName: \"kubernetes.io/projected/a8f953fc-1eac-414d-b93e-e98eaa5aea79-kube-api-access-nmskm\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.591905 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-catalog-content\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.593209 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.093193316 +0000 UTC m=+149.625160805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.600724 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-utilities\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.601106 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-catalog-content\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.637139 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmskm\" 
(UniqueName: \"kubernetes.io/projected/a8f953fc-1eac-414d-b93e-e98eaa5aea79-kube-api-access-nmskm\") pod \"redhat-marketplace-v9qxz\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.693785 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.694534 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.194518222 +0000 UTC m=+149.726485711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.721631 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:03 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:03 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:03 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.721699 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.778216 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.798768 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 19:31:03.799153 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.299136412 +0000 UTC m=+149.831103901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.820821 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zjfsp"] Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.844632 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.848855 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjfsp"] Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.900638 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.900681 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78r9c\" (UniqueName: \"kubernetes.io/projected/36fc9599-8a55-4896-b7a0-531c72c7da25-kube-api-access-78r9c\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.900768 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-utilities\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:03 crc kubenswrapper[4907]: I1009 19:31:03.900790 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-catalog-content\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:03 crc kubenswrapper[4907]: E1009 
19:31:03.901240 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.401211477 +0000 UTC m=+149.933179156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.006575 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.006865 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78r9c\" (UniqueName: \"kubernetes.io/projected/36fc9599-8a55-4896-b7a0-531c72c7da25-kube-api-access-78r9c\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.006960 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-utilities\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:04 crc kubenswrapper[4907]: 
I1009 19:31:04.006977 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-catalog-content\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.010195 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.510170188 +0000 UTC m=+150.042137677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.010458 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-utilities\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.011524 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-catalog-content\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:04 crc 
kubenswrapper[4907]: W1009 19:31:04.042236 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-b3b0a6f214b9b22781ece2a45908819d2fa7c6289496b1327453356a7ac05f24 WatchSource:0}: Error finding container b3b0a6f214b9b22781ece2a45908819d2fa7c6289496b1327453356a7ac05f24: Status 404 returned error can't find the container with id b3b0a6f214b9b22781ece2a45908819d2fa7c6289496b1327453356a7ac05f24 Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.045561 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78r9c\" (UniqueName: \"kubernetes.io/projected/36fc9599-8a55-4896-b7a0-531c72c7da25-kube-api-access-78r9c\") pod \"redhat-marketplace-zjfsp\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.109242 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.110200 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.61018484 +0000 UTC m=+150.142152339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.215075 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.215268 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.715251051 +0000 UTC m=+150.247218540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.215398 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.215717 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.715708183 +0000 UTC m=+150.247675672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.235366 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.318932 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.319216 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.819200194 +0000 UTC m=+150.351167683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.363078 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v9qxz"] Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.404172 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rddhr"] Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.405557 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.409349 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.418891 4907 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.421835 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rddhr"] Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.423316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-catalog-content\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.423389 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sm5w\" (UniqueName: \"kubernetes.io/projected/1d25de17-0079-44bd-9595-ea432cbd0982-kube-api-access-4sm5w\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.423425 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:04 crc kubenswrapper[4907]: 
I1009 19:31:04.423459 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-utilities\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.423848 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:04.923834824 +0000 UTC m=+150.455802313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.528541 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.529171 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sm5w\" (UniqueName: \"kubernetes.io/projected/1d25de17-0079-44bd-9595-ea432cbd0982-kube-api-access-4sm5w\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc 
kubenswrapper[4907]: I1009 19:31:04.529224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-utilities\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.529276 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-catalog-content\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.529732 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-catalog-content\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.529803 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.029787909 +0000 UTC m=+150.561755398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.532694 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-utilities\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.560892 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sm5w\" (UniqueName: \"kubernetes.io/projected/1d25de17-0079-44bd-9595-ea432cbd0982-kube-api-access-4sm5w\") pod \"redhat-operators-rddhr\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.569904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7818779341b56f271282df7e4396241f532b8639a6c00a0bfd3fa265b3e24269"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.569958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b67af281715be793fa1999c1b1e76bf5fefb86abd427d9084a7f4f029c9bec0d"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.600044 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" event={"ID":"e4b6fb62-3f71-480f-b283-3da1fe2b63b5","Type":"ContainerStarted","Data":"5aba8b8750bc8f7d35718dc11112915007d4604282849205ecf3eeed84eebfe8"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.615269 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" event={"ID":"56197cfc-90ab-486e-8812-3ac1d97b9f2a","Type":"ContainerStarted","Data":"5e83321cc0d459b0490fbbda8f4d35b63d00a2e2c57d2363dcee1705e1b39019"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.623716 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v9qxz" event={"ID":"a8f953fc-1eac-414d-b93e-e98eaa5aea79","Type":"ContainerStarted","Data":"e38331277e49c1d7e971520863e0ce387869a93b0c99d4d4e144ba8cb30a8912"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.630895 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.631254 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.131238797 +0000 UTC m=+150.663206296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.638929 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" podStartSLOduration=129.638901614 podStartE2EDuration="2m9.638901614s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:04.635301421 +0000 UTC m=+150.167268910" watchObservedRunningTime="2025-10-09 19:31:04.638901614 +0000 UTC m=+150.170869103" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.644919 4907 generic.go:334] "Generic (PLEG): container finished" podID="cec75db6-ed33-4e33-b35c-44a59a054859" containerID="53f2669a7b2ff3c9d395001ca1d0e01d719bbfa8988393966f2e23a3afb178b5" exitCode=0 Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.646013 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27prd" event={"ID":"cec75db6-ed33-4e33-b35c-44a59a054859","Type":"ContainerDied","Data":"53f2669a7b2ff3c9d395001ca1d0e01d719bbfa8988393966f2e23a3afb178b5"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.660266 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4392acd772afef2d2fc924a0ca4f261e685c8d8fcf7730a019a95e6997e50599"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 
19:31:04.661365 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cd435964afebc7043d5f27081ee79ed091943e8970db15b8359f46300e26422e"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.662962 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.679356 4907 generic.go:334] "Generic (PLEG): container finished" podID="86eff4e6-938a-48fa-a116-c46597bc0868" containerID="ecc416dcb1792cb9d8173232b33a63f5671bfd94f2df0f527d6ddd225f51c7f6" exitCode=0 Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.679451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" event={"ID":"86eff4e6-938a-48fa-a116-c46597bc0868","Type":"ContainerDied","Data":"ecc416dcb1792cb9d8173232b33a63f5671bfd94f2df0f527d6ddd225f51c7f6"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.680169 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjfsp"] Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.682762 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c81d4da08834c40e5f055cc021d24f7b0655ea7545762157f8daa6c1e3614957"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.682810 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b3b0a6f214b9b22781ece2a45908819d2fa7c6289496b1327453356a7ac05f24"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 
19:31:04.712311 4907 generic.go:334] "Generic (PLEG): container finished" podID="61589af1-8a53-445d-afef-ff35192b01a5" containerID="a0ff25678fa37abf81c3322eda43e83eb9654de0910382d4e5a56e622ad8560e" exitCode=0 Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.714025 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vchbq" event={"ID":"61589af1-8a53-445d-afef-ff35192b01a5","Type":"ContainerDied","Data":"a0ff25678fa37abf81c3322eda43e83eb9654de0910382d4e5a56e622ad8560e"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.714063 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vchbq" event={"ID":"61589af1-8a53-445d-afef-ff35192b01a5","Type":"ContainerStarted","Data":"448b2d7dce78ee41eec81d3d90e6a0d1a3a9a30b0dda22e564f2701aa064b337"} Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.732161 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:04 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:04 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:04 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.732247 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.733715 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-94nv4" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.734833 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.735178 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.737219 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.237191492 +0000 UTC m=+150.769158981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: W1009 19:31:04.744754 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36fc9599_8a55_4896_b7a0_531c72c7da25.slice/crio-18f8ff928c071bfd57c7641dfbdadb8895add8dc842b2b6bc8f7b6e6871f9611 WatchSource:0}: Error finding container 18f8ff928c071bfd57c7641dfbdadb8895add8dc842b2b6bc8f7b6e6871f9611: Status 404 returned error can't find the container with id 18f8ff928c071bfd57c7641dfbdadb8895add8dc842b2b6bc8f7b6e6871f9611 Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.816444 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-lz9vz"] Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.824522 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.837056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.837674 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.337642055 +0000 UTC m=+150.869609704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.838730 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lz9vz"] Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.941205 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.941622 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.441591458 +0000 UTC m=+150.973558947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.942372 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-catalog-content\") pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.942500 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-utilities\") pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.942571 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:04 crc kubenswrapper[4907]: I1009 19:31:04.942655 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npc58\" (UniqueName: \"kubernetes.io/projected/ceb6c018-5abf-4ec5-b111-82d3a60ff855-kube-api-access-npc58\") 
pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:04 crc kubenswrapper[4907]: E1009 19:31:04.943189 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.443173078 +0000 UTC m=+150.975140567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.045102 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:05 crc kubenswrapper[4907]: E1009 19:31:05.045423 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.545330794 +0000 UTC m=+151.077298283 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.045493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-catalog-content\") pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.045649 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-utilities\") pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.045735 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.045900 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npc58\" (UniqueName: \"kubernetes.io/projected/ceb6c018-5abf-4ec5-b111-82d3a60ff855-kube-api-access-npc58\") pod \"redhat-operators-lz9vz\" (UID: 
\"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.046878 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-catalog-content\") pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:05 crc kubenswrapper[4907]: E1009 19:31:05.047080 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.547063698 +0000 UTC m=+151.079031187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.047079 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-utilities\") pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.061305 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rddhr"] Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.111199 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npc58\" 
(UniqueName: \"kubernetes.io/projected/ceb6c018-5abf-4ec5-b111-82d3a60ff855-kube-api-access-npc58\") pod \"redhat-operators-lz9vz\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.146783 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:05 crc kubenswrapper[4907]: E1009 19:31:05.147737 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.647702846 +0000 UTC m=+151.179670335 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.162503 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.249493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:05 crc kubenswrapper[4907]: E1009 19:31:05.249971 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.749956636 +0000 UTC m=+151.281924125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mdh4s" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.350569 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:05 crc kubenswrapper[4907]: E1009 19:31:05.351128 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 19:31:05.851107867 +0000 UTC m=+151.383075356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.351170 4907 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-09T19:31:04.418921429Z","Handler":null,"Name":""} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.413531 4907 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.413578 4907 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.452104 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.455870 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.455899 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.506188 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mdh4s\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.553658 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.562081 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.564344 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lz9vz"] Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.570097 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:05 crc kubenswrapper[4907]: W1009 19:31:05.607751 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb6c018_5abf_4ec5_b111_82d3a60ff855.slice/crio-112fa03932009aeb48150a4eb6fba640594bf5cac3cb5fdbd77dd44cfe637a0b WatchSource:0}: Error finding container 112fa03932009aeb48150a4eb6fba640594bf5cac3cb5fdbd77dd44cfe637a0b: Status 404 returned error can't find the container with id 112fa03932009aeb48150a4eb6fba640594bf5cac3cb5fdbd77dd44cfe637a0b Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.725116 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:05 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:05 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:05 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.725191 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.754880 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" 
event={"ID":"56197cfc-90ab-486e-8812-3ac1d97b9f2a","Type":"ContainerStarted","Data":"a5b21f45d31f9992e6ecbebd40d440d12a92625afeae47149c547ea05a7eb578"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.754937 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" event={"ID":"56197cfc-90ab-486e-8812-3ac1d97b9f2a","Type":"ContainerStarted","Data":"310e9dfb61994b4c4befc598a898084319aafb21fefbc7dd160e162ff0090db7"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.781950 4907 generic.go:334] "Generic (PLEG): container finished" podID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerID="b6e144c01c23d9ba19e9ff3e9253de3a282b6646000bfab3a81a4d8848b0e4d3" exitCode=0 Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.782057 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjfsp" event={"ID":"36fc9599-8a55-4896-b7a0-531c72c7da25","Type":"ContainerDied","Data":"b6e144c01c23d9ba19e9ff3e9253de3a282b6646000bfab3a81a4d8848b0e4d3"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.782549 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjfsp" event={"ID":"36fc9599-8a55-4896-b7a0-531c72c7da25","Type":"ContainerStarted","Data":"18f8ff928c071bfd57c7641dfbdadb8895add8dc842b2b6bc8f7b6e6871f9611"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.784421 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-g48tm" podStartSLOduration=12.784395666 podStartE2EDuration="12.784395666s" podCreationTimestamp="2025-10-09 19:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:05.777533681 +0000 UTC m=+151.309501180" watchObservedRunningTime="2025-10-09 19:31:05.784395666 +0000 UTC m=+151.316363155" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 
19:31:05.810739 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.810791 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.813970 4907 patch_prober.go:28] interesting pod/console-f9d7485db-2fnwq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.814075 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2fnwq" podUID="957d72db-4cb4-4e97-bb11-2f25eb03f259" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.816433 4907 generic.go:334] "Generic (PLEG): container finished" podID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerID="b8adb9a5049ea5e279f5ab4cf514228d2e839ea5efea209cc208b1a242c6bddf" exitCode=0 Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.816590 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v9qxz" event={"ID":"a8f953fc-1eac-414d-b93e-e98eaa5aea79","Type":"ContainerDied","Data":"b8adb9a5049ea5e279f5ab4cf514228d2e839ea5efea209cc208b1a242c6bddf"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.843003 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d25de17-0079-44bd-9595-ea432cbd0982" containerID="df622bac24007560de88f1522c4e3ce7e732016908c0ffba919e2b4aef08e8f9" exitCode=0 Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.843086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rddhr" event={"ID":"1d25de17-0079-44bd-9595-ea432cbd0982","Type":"ContainerDied","Data":"df622bac24007560de88f1522c4e3ce7e732016908c0ffba919e2b4aef08e8f9"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.843119 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rddhr" event={"ID":"1d25de17-0079-44bd-9595-ea432cbd0982","Type":"ContainerStarted","Data":"3e7ed7c3d28513e08aa51ff5d1039e91f094f051435f10a8c492945b438d6b8b"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.858971 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lz9vz" event={"ID":"ceb6c018-5abf-4ec5-b111-82d3a60ff855","Type":"ContainerStarted","Data":"112fa03932009aeb48150a4eb6fba640594bf5cac3cb5fdbd77dd44cfe637a0b"} Oct 09 19:31:05 crc kubenswrapper[4907]: I1009 19:31:05.948372 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mdh4s"] Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.150010 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.150981 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.157539 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.178319 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.178423 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.179029 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.179099 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.272136 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.281426 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.281541 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.281617 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.299290 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.299365 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.328825 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-754vb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.328893 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-754vb" podUID="fc36d689-70da-40c4-93ae-f5e35e414999" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.329419 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-754vb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.329441 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-754vb" podUID="fc36d689-70da-40c4-93ae-f5e35e414999" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.365480 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.386153 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume\") pod \"86eff4e6-938a-48fa-a116-c46597bc0868\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.386291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxz4c\" (UniqueName: \"kubernetes.io/projected/86eff4e6-938a-48fa-a116-c46597bc0868-kube-api-access-fxz4c\") pod \"86eff4e6-938a-48fa-a116-c46597bc0868\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.386396 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86eff4e6-938a-48fa-a116-c46597bc0868-secret-volume\") pod \"86eff4e6-938a-48fa-a116-c46597bc0868\" (UID: \"86eff4e6-938a-48fa-a116-c46597bc0868\") " Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.388176 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume" (OuterVolumeSpecName: "config-volume") pod "86eff4e6-938a-48fa-a116-c46597bc0868" (UID: "86eff4e6-938a-48fa-a116-c46597bc0868"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.392502 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86eff4e6-938a-48fa-a116-c46597bc0868-kube-api-access-fxz4c" (OuterVolumeSpecName: "kube-api-access-fxz4c") pod "86eff4e6-938a-48fa-a116-c46597bc0868" (UID: "86eff4e6-938a-48fa-a116-c46597bc0868"). InnerVolumeSpecName "kube-api-access-fxz4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.392648 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86eff4e6-938a-48fa-a116-c46597bc0868-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "86eff4e6-938a-48fa-a116-c46597bc0868" (UID: "86eff4e6-938a-48fa-a116-c46597bc0868"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.487927 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxz4c\" (UniqueName: \"kubernetes.io/projected/86eff4e6-938a-48fa-a116-c46597bc0868-kube-api-access-fxz4c\") on node \"crc\" DevicePath \"\"" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.487968 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86eff4e6-938a-48fa-a116-c46597bc0868-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.487978 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86eff4e6-938a-48fa-a116-c46597bc0868-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.556703 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.720708 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:06 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:06 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:06 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.720848 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.911542 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" event={"ID":"86eff4e6-938a-48fa-a116-c46597bc0868","Type":"ContainerDied","Data":"d8c65f0f063869102a45f0ed618e542491a340d08a3ef6ca3b553809bb0519e4"} Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.912027 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8c65f0f063869102a45f0ed618e542491a340d08a3ef6ca3b553809bb0519e4" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.912117 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.924882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" event={"ID":"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e","Type":"ContainerStarted","Data":"560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87"} Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.924938 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" event={"ID":"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e","Type":"ContainerStarted","Data":"17bfddbb1c569a9aa47780758d22f9b650cf6e409eed4767c0bbc1d4f89face4"} Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.926051 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.939353 4907 generic.go:334] "Generic (PLEG): container finished" podID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerID="8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31" exitCode=0 Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.940448 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lz9vz" event={"ID":"ceb6c018-5abf-4ec5-b111-82d3a60ff855","Type":"ContainerDied","Data":"8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31"} Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.953836 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" podStartSLOduration=131.953810373 podStartE2EDuration="2m11.953810373s" podCreationTimestamp="2025-10-09 19:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 
19:31:06.950370975 +0000 UTC m=+152.482338484" watchObservedRunningTime="2025-10-09 19:31:06.953810373 +0000 UTC m=+152.485777852" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.966100 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.966166 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.979524 4907 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wgct9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]log ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]etcd ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/generic-apiserver-start-informers ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/max-in-flight-filter ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 09 19:31:06 crc kubenswrapper[4907]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 09 19:31:06 crc kubenswrapper[4907]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/project.openshift.io-projectcache ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-startinformers ok Oct 09 19:31:06 crc kubenswrapper[4907]: 
[+]poststarthook/openshift.io-restmapperupdater ok Oct 09 19:31:06 crc kubenswrapper[4907]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 09 19:31:06 crc kubenswrapper[4907]: livez check failed Oct 09 19:31:06 crc kubenswrapper[4907]: I1009 19:31:06.979578 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" podUID="e4b6fb62-3f71-480f-b283-3da1fe2b63b5" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.007618 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 19:31:07 crc kubenswrapper[4907]: W1009 19:31:07.021235 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8fc60943_2e04_4c5c_a8e1_0bdf136be24c.slice/crio-c6fbcaed564e69f94487b92cd5f7ab0c0ab4b9b08858a7321399b0c6fa4f0da3 WatchSource:0}: Error finding container c6fbcaed564e69f94487b92cd5f7ab0c0ab4b9b08858a7321399b0c6fa4f0da3: Status 404 returned error can't find the container with id c6fbcaed564e69f94487b92cd5f7ab0c0ab4b9b08858a7321399b0c6fa4f0da3 Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.139832 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.140368 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.147420 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.161123 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" 
path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.717953 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.722894 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:07 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:07 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:07 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.722988 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.967538 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8fc60943-2e04-4c5c-a8e1-0bdf136be24c","Type":"ContainerStarted","Data":"378c8dd09f441d69b5adb3c945f27dc883049de716aac2234e6445feda17b5cf"} Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.967624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8fc60943-2e04-4c5c-a8e1-0bdf136be24c","Type":"ContainerStarted","Data":"c6fbcaed564e69f94487b92cd5f7ab0c0ab4b9b08858a7321399b0c6fa4f0da3"} Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 19:31:07.975565 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9qvzk" Oct 09 19:31:07 crc kubenswrapper[4907]: I1009 
19:31:07.993791 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.9937610829999999 podStartE2EDuration="1.993761083s" podCreationTimestamp="2025-10-09 19:31:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:07.991101224 +0000 UTC m=+153.523068733" watchObservedRunningTime="2025-10-09 19:31:07.993761083 +0000 UTC m=+153.525728572" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.718408 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:08 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:08 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:08 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.718941 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.890890 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 19:31:08 crc kubenswrapper[4907]: E1009 19:31:08.891260 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86eff4e6-938a-48fa-a116-c46597bc0868" containerName="collect-profiles" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.891280 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="86eff4e6-938a-48fa-a116-c46597bc0868" containerName="collect-profiles" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 
19:31:08.891444 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="86eff4e6-938a-48fa-a116-c46597bc0868" containerName="collect-profiles" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.892178 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.900726 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.901268 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.901574 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.986769 4907 generic.go:334] "Generic (PLEG): container finished" podID="8fc60943-2e04-4c5c-a8e1-0bdf136be24c" containerID="378c8dd09f441d69b5adb3c945f27dc883049de716aac2234e6445feda17b5cf" exitCode=0 Oct 09 19:31:08 crc kubenswrapper[4907]: I1009 19:31:08.988210 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8fc60943-2e04-4c5c-a8e1-0bdf136be24c","Type":"ContainerDied","Data":"378c8dd09f441d69b5adb3c945f27dc883049de716aac2234e6445feda17b5cf"} Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.031416 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec71f741-3829-46f9-aa15-5f0caa8552a1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.031515 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec71f741-3829-46f9-aa15-5f0caa8552a1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.134405 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec71f741-3829-46f9-aa15-5f0caa8552a1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.134516 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec71f741-3829-46f9-aa15-5f0caa8552a1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.134879 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec71f741-3829-46f9-aa15-5f0caa8552a1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.156794 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec71f741-3829-46f9-aa15-5f0caa8552a1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.230018 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.702368 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.718757 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:09 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:09 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:09 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:09 crc kubenswrapper[4907]: I1009 19:31:09.718846 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.028407 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ec71f741-3829-46f9-aa15-5f0caa8552a1","Type":"ContainerStarted","Data":"055d8dc52ffb472c42ee1a00090baacc5fe2acb8947e99ae0606aad67986e073"} Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.458371 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.583386 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kubelet-dir\") pod \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.583596 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8fc60943-2e04-4c5c-a8e1-0bdf136be24c" (UID: "8fc60943-2e04-4c5c-a8e1-0bdf136be24c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.584068 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kube-api-access\") pod \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\" (UID: \"8fc60943-2e04-4c5c-a8e1-0bdf136be24c\") " Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.584540 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.594126 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8fc60943-2e04-4c5c-a8e1-0bdf136be24c" (UID: "8fc60943-2e04-4c5c-a8e1-0bdf136be24c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.705757 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fc60943-2e04-4c5c-a8e1-0bdf136be24c-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.719843 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:10 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:10 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:10 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:10 crc kubenswrapper[4907]: I1009 19:31:10.719924 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:11 crc kubenswrapper[4907]: I1009 19:31:11.058435 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8fc60943-2e04-4c5c-a8e1-0bdf136be24c","Type":"ContainerDied","Data":"c6fbcaed564e69f94487b92cd5f7ab0c0ab4b9b08858a7321399b0c6fa4f0da3"} Oct 09 19:31:11 crc kubenswrapper[4907]: I1009 19:31:11.058889 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6fbcaed564e69f94487b92cd5f7ab0c0ab4b9b08858a7321399b0c6fa4f0da3" Oct 09 19:31:11 crc kubenswrapper[4907]: I1009 19:31:11.058526 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 19:31:11 crc kubenswrapper[4907]: I1009 19:31:11.719515 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:11 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:11 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:11 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:11 crc kubenswrapper[4907]: I1009 19:31:11.719623 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:11 crc kubenswrapper[4907]: I1009 19:31:11.970953 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:31:11 crc kubenswrapper[4907]: I1009 19:31:11.976094 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wgct9" Oct 09 19:31:12 crc kubenswrapper[4907]: I1009 19:31:12.113729 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ec71f741-3829-46f9-aa15-5f0caa8552a1","Type":"ContainerStarted","Data":"3fb3af748854834496b215862134fbc8b6c4df9226e7143b1c18d84b15aa9465"} Oct 09 19:31:12 crc kubenswrapper[4907]: I1009 19:31:12.196047 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h9vb8" Oct 09 19:31:12 crc kubenswrapper[4907]: I1009 19:31:12.227140 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
podStartSLOduration=4.227121894 podStartE2EDuration="4.227121894s" podCreationTimestamp="2025-10-09 19:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:12.147971628 +0000 UTC m=+157.679939127" watchObservedRunningTime="2025-10-09 19:31:12.227121894 +0000 UTC m=+157.759089383" Oct 09 19:31:12 crc kubenswrapper[4907]: I1009 19:31:12.718928 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:12 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:12 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:12 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:12 crc kubenswrapper[4907]: I1009 19:31:12.719343 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:13 crc kubenswrapper[4907]: I1009 19:31:13.124961 4907 generic.go:334] "Generic (PLEG): container finished" podID="ec71f741-3829-46f9-aa15-5f0caa8552a1" containerID="3fb3af748854834496b215862134fbc8b6c4df9226e7143b1c18d84b15aa9465" exitCode=0 Oct 09 19:31:13 crc kubenswrapper[4907]: I1009 19:31:13.125011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ec71f741-3829-46f9-aa15-5f0caa8552a1","Type":"ContainerDied","Data":"3fb3af748854834496b215862134fbc8b6c4df9226e7143b1c18d84b15aa9465"} Oct 09 19:31:13 crc kubenswrapper[4907]: I1009 19:31:13.722412 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:13 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:13 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:13 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:13 crc kubenswrapper[4907]: I1009 19:31:13.722510 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.392199 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.498057 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec71f741-3829-46f9-aa15-5f0caa8552a1-kubelet-dir\") pod \"ec71f741-3829-46f9-aa15-5f0caa8552a1\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.498142 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec71f741-3829-46f9-aa15-5f0caa8552a1-kube-api-access\") pod \"ec71f741-3829-46f9-aa15-5f0caa8552a1\" (UID: \"ec71f741-3829-46f9-aa15-5f0caa8552a1\") " Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.498252 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec71f741-3829-46f9-aa15-5f0caa8552a1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ec71f741-3829-46f9-aa15-5f0caa8552a1" (UID: "ec71f741-3829-46f9-aa15-5f0caa8552a1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.498705 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec71f741-3829-46f9-aa15-5f0caa8552a1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.509338 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec71f741-3829-46f9-aa15-5f0caa8552a1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ec71f741-3829-46f9-aa15-5f0caa8552a1" (UID: "ec71f741-3829-46f9-aa15-5f0caa8552a1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.600020 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ec71f741-3829-46f9-aa15-5f0caa8552a1-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.717841 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:14 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:14 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:14 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:14 crc kubenswrapper[4907]: I1009 19:31:14.717899 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:15 crc kubenswrapper[4907]: I1009 19:31:15.146642 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ec71f741-3829-46f9-aa15-5f0caa8552a1","Type":"ContainerDied","Data":"055d8dc52ffb472c42ee1a00090baacc5fe2acb8947e99ae0606aad67986e073"} Oct 09 19:31:15 crc kubenswrapper[4907]: I1009 19:31:15.146684 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="055d8dc52ffb472c42ee1a00090baacc5fe2acb8947e99ae0606aad67986e073" Oct 09 19:31:15 crc kubenswrapper[4907]: I1009 19:31:15.146741 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 19:31:15 crc kubenswrapper[4907]: I1009 19:31:15.718951 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:15 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:15 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:15 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:15 crc kubenswrapper[4907]: I1009 19:31:15.719014 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:15 crc kubenswrapper[4907]: I1009 19:31:15.812825 4907 patch_prober.go:28] interesting pod/console-f9d7485db-2fnwq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 09 19:31:15 crc kubenswrapper[4907]: I1009 19:31:15.812886 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2fnwq" podUID="957d72db-4cb4-4e97-bb11-2f25eb03f259" 
containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 09 19:31:16 crc kubenswrapper[4907]: I1009 19:31:16.327060 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-754vb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 09 19:31:16 crc kubenswrapper[4907]: I1009 19:31:16.327583 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-754vb" podUID="fc36d689-70da-40c4-93ae-f5e35e414999" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 09 19:31:16 crc kubenswrapper[4907]: I1009 19:31:16.330420 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-754vb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 09 19:31:16 crc kubenswrapper[4907]: I1009 19:31:16.330507 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-754vb" podUID="fc36d689-70da-40c4-93ae-f5e35e414999" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 09 19:31:16 crc kubenswrapper[4907]: I1009 19:31:16.718746 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:16 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:16 crc kubenswrapper[4907]: [+]process-running ok 
Oct 09 19:31:16 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:16 crc kubenswrapper[4907]: I1009 19:31:16.718840 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:16 crc kubenswrapper[4907]: I1009 19:31:16.953732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:31:17 crc kubenswrapper[4907]: I1009 19:31:17.056264 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b-metrics-certs\") pod \"network-metrics-daemon-sbjsv\" (UID: \"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b\") " pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:31:17 crc kubenswrapper[4907]: I1009 19:31:17.068586 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sbjsv" Oct 09 19:31:17 crc kubenswrapper[4907]: I1009 19:31:17.722724 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:17 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:17 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:17 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:17 crc kubenswrapper[4907]: I1009 19:31:17.723172 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:18 crc kubenswrapper[4907]: I1009 19:31:18.718665 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:18 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:18 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:18 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:18 crc kubenswrapper[4907]: I1009 19:31:18.718801 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:19 crc kubenswrapper[4907]: I1009 19:31:19.719765 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:19 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:19 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:19 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:19 crc kubenswrapper[4907]: I1009 19:31:19.719906 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:20 crc kubenswrapper[4907]: I1009 19:31:20.719644 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:20 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:20 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:20 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:20 crc kubenswrapper[4907]: I1009 19:31:20.719749 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:21 crc kubenswrapper[4907]: I1009 19:31:21.718333 4907 patch_prober.go:28] interesting pod/router-default-5444994796-mzwdh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 19:31:21 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Oct 09 19:31:21 crc kubenswrapper[4907]: [+]process-running ok Oct 09 19:31:21 crc kubenswrapper[4907]: healthz check failed Oct 09 19:31:21 crc kubenswrapper[4907]: I1009 19:31:21.718867 4907 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mzwdh" podUID="4ef0fdd5-8e73-4320-8021-e6f28b26f248" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 19:31:22 crc kubenswrapper[4907]: I1009 19:31:22.722393 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:31:22 crc kubenswrapper[4907]: I1009 19:31:22.726733 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mzwdh" Oct 09 19:31:25 crc kubenswrapper[4907]: I1009 19:31:25.576035 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:31:25 crc kubenswrapper[4907]: I1009 19:31:25.816336 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:31:25 crc kubenswrapper[4907]: I1009 19:31:25.821158 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:31:26 crc kubenswrapper[4907]: I1009 19:31:26.334070 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-754vb" Oct 09 19:31:36 crc kubenswrapper[4907]: I1009 19:31:36.299721 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:31:36 crc kubenswrapper[4907]: I1009 19:31:36.300578 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:31:36 crc kubenswrapper[4907]: I1009 19:31:36.423061 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7dphk" Oct 09 19:31:37 crc kubenswrapper[4907]: E1009 19:31:37.264099 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 19:31:37 crc kubenswrapper[4907]: E1009 19:31:37.264925 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4sm5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rddhr_openshift-marketplace(1d25de17-0079-44bd-9595-ea432cbd0982): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 19:31:37 crc kubenswrapper[4907]: E1009 19:31:37.266218 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rddhr" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" Oct 09 19:31:39 crc 
kubenswrapper[4907]: E1009 19:31:39.320095 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rddhr" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" Oct 09 19:31:43 crc kubenswrapper[4907]: E1009 19:31:43.228532 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 19:31:43 crc kubenswrapper[4907]: E1009 19:31:43.228761 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b72p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-27prd_openshift-marketplace(cec75db6-ed33-4e33-b35c-44a59a054859): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 19:31:43 crc kubenswrapper[4907]: E1009 19:31:43.230531 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-27prd" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" Oct 09 19:31:43 crc 
kubenswrapper[4907]: I1009 19:31:43.856991 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 19:31:45 crc kubenswrapper[4907]: E1009 19:31:45.917343 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 19:31:45 crc kubenswrapper[4907]: E1009 19:31:45.917695 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78r9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminatio
nMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zjfsp_openshift-marketplace(36fc9599-8a55-4896-b7a0-531c72c7da25): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 19:31:45 crc kubenswrapper[4907]: E1009 19:31:45.918993 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zjfsp" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" Oct 09 19:31:48 crc kubenswrapper[4907]: E1009 19:31:48.926772 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-27prd" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" Oct 09 19:31:48 crc kubenswrapper[4907]: E1009 19:31:48.926782 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zjfsp" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.062419 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.062652 4907 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d2b7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7t9j5_openshift-marketplace(84707a79-5b88-454b-9e1f-5618515a5623): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.063872 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7t9j5" podUID="84707a79-5b88-454b-9e1f-5618515a5623" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.083757 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.083960 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmskm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false
,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-v9qxz_openshift-marketplace(a8f953fc-1eac-414d-b93e-e98eaa5aea79): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.085768 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-v9qxz" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.139864 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.140096 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gcbq6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vchbq_openshift-marketplace(61589af1-8a53-445d-afef-ff35192b01a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.141368 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vchbq" podUID="61589af1-8a53-445d-afef-ff35192b01a5" Oct 09 19:31:49 crc 
kubenswrapper[4907]: E1009 19:31:49.147502 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.147650 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dcxwm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-b5b4h_openshift-marketplace(bf99f768-d09e-4105-9150-39b510795216): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.149484 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-b5b4h" podUID="bf99f768-d09e-4105-9150-39b510795216" Oct 09 19:31:49 crc kubenswrapper[4907]: I1009 19:31:49.393961 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sbjsv"] Oct 09 19:31:49 crc kubenswrapper[4907]: I1009 19:31:49.403839 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lz9vz" event={"ID":"ceb6c018-5abf-4ec5-b111-82d3a60ff855","Type":"ContainerStarted","Data":"3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6"} Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.407813 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-b5b4h" podUID="bf99f768-d09e-4105-9150-39b510795216" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.407837 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-v9qxz" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.407867 4907 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vchbq" podUID="61589af1-8a53-445d-afef-ff35192b01a5" Oct 09 19:31:49 crc kubenswrapper[4907]: E1009 19:31:49.407892 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7t9j5" podUID="84707a79-5b88-454b-9e1f-5618515a5623" Oct 09 19:31:50 crc kubenswrapper[4907]: I1009 19:31:50.412199 4907 generic.go:334] "Generic (PLEG): container finished" podID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerID="3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6" exitCode=0 Oct 09 19:31:50 crc kubenswrapper[4907]: I1009 19:31:50.412772 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lz9vz" event={"ID":"ceb6c018-5abf-4ec5-b111-82d3a60ff855","Type":"ContainerDied","Data":"3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6"} Oct 09 19:31:50 crc kubenswrapper[4907]: I1009 19:31:50.417054 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" event={"ID":"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b","Type":"ContainerStarted","Data":"cb06fcd5623c863308f6d1529241d75519e160147064ff0baaf2f6c4c5b190d4"} Oct 09 19:31:50 crc kubenswrapper[4907]: I1009 19:31:50.417127 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" event={"ID":"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b","Type":"ContainerStarted","Data":"188ef21bcb47a4840f692efc3197a936ad11ee1ed51a4d851132aed4d5761a13"} Oct 09 19:31:50 crc kubenswrapper[4907]: I1009 19:31:50.417145 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sbjsv" event={"ID":"06f1d8c6-56f8-4c4f-a6ef-eee7ad25da4b","Type":"ContainerStarted","Data":"de1214c8ae0900524853069668c82b048b10f541410115d6f6b80dae67aa8028"} Oct 09 19:31:50 crc kubenswrapper[4907]: I1009 19:31:50.447630 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sbjsv" podStartSLOduration=176.447611628 podStartE2EDuration="2m56.447611628s" podCreationTimestamp="2025-10-09 19:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:31:50.445861613 +0000 UTC m=+195.977829102" watchObservedRunningTime="2025-10-09 19:31:50.447611628 +0000 UTC m=+195.979579117" Oct 09 19:31:51 crc kubenswrapper[4907]: I1009 19:31:51.428187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lz9vz" event={"ID":"ceb6c018-5abf-4ec5-b111-82d3a60ff855","Type":"ContainerStarted","Data":"c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209"} Oct 09 19:31:51 crc kubenswrapper[4907]: I1009 19:31:51.459041 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lz9vz" podStartSLOduration=3.439126936 podStartE2EDuration="47.459003377s" podCreationTimestamp="2025-10-09 19:31:04 +0000 UTC" firstStartedPulling="2025-10-09 19:31:06.943341305 +0000 UTC m=+152.475308794" lastFinishedPulling="2025-10-09 19:31:50.963217706 +0000 UTC m=+196.495185235" observedRunningTime="2025-10-09 19:31:51.456366493 +0000 UTC m=+196.988334002" watchObservedRunningTime="2025-10-09 19:31:51.459003377 +0000 UTC m=+196.990970866" Oct 09 19:31:55 crc kubenswrapper[4907]: I1009 19:31:55.162807 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:55 crc 
kubenswrapper[4907]: I1009 19:31:55.163667 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:31:56 crc kubenswrapper[4907]: I1009 19:31:56.464160 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d25de17-0079-44bd-9595-ea432cbd0982" containerID="dc02e34de8cc5797ae7c1674131f52a5184ab61aa460d8d262c027af980d1197" exitCode=0 Oct 09 19:31:56 crc kubenswrapper[4907]: I1009 19:31:56.464309 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rddhr" event={"ID":"1d25de17-0079-44bd-9595-ea432cbd0982","Type":"ContainerDied","Data":"dc02e34de8cc5797ae7c1674131f52a5184ab61aa460d8d262c027af980d1197"} Oct 09 19:31:56 crc kubenswrapper[4907]: I1009 19:31:56.472357 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lz9vz" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="registry-server" probeResult="failure" output=< Oct 09 19:31:56 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 19:31:56 crc kubenswrapper[4907]: > Oct 09 19:31:58 crc kubenswrapper[4907]: I1009 19:31:58.478576 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rddhr" event={"ID":"1d25de17-0079-44bd-9595-ea432cbd0982","Type":"ContainerStarted","Data":"2b337a9fae11ccc3b7cff8f949d4ba92b7018116607bfddc5bb95b961251d180"} Oct 09 19:31:58 crc kubenswrapper[4907]: I1009 19:31:58.503866 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rddhr" podStartSLOduration=2.780175727 podStartE2EDuration="54.503834705s" podCreationTimestamp="2025-10-09 19:31:04 +0000 UTC" firstStartedPulling="2025-10-09 19:31:05.845608644 +0000 UTC m=+151.377576123" lastFinishedPulling="2025-10-09 19:31:57.569267602 +0000 UTC m=+203.101235101" observedRunningTime="2025-10-09 19:31:58.501182311 +0000 
UTC m=+204.033149810" watchObservedRunningTime="2025-10-09 19:31:58.503834705 +0000 UTC m=+204.035802194" Oct 09 19:32:04 crc kubenswrapper[4907]: I1009 19:32:04.518161 4907 generic.go:334] "Generic (PLEG): container finished" podID="bf99f768-d09e-4105-9150-39b510795216" containerID="318d3b0311cb780240a299129789e7838c8a7b27e944f7d7a6db9e721c700ae5" exitCode=0 Oct 09 19:32:04 crc kubenswrapper[4907]: I1009 19:32:04.518213 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5b4h" event={"ID":"bf99f768-d09e-4105-9150-39b510795216","Type":"ContainerDied","Data":"318d3b0311cb780240a299129789e7838c8a7b27e944f7d7a6db9e721c700ae5"} Oct 09 19:32:04 crc kubenswrapper[4907]: I1009 19:32:04.522911 4907 generic.go:334] "Generic (PLEG): container finished" podID="cec75db6-ed33-4e33-b35c-44a59a054859" containerID="52f29d9737ceaef7419580ba885b35182e38068b353866d1e07a00ffeab0b495" exitCode=0 Oct 09 19:32:04 crc kubenswrapper[4907]: I1009 19:32:04.522969 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27prd" event={"ID":"cec75db6-ed33-4e33-b35c-44a59a054859","Type":"ContainerDied","Data":"52f29d9737ceaef7419580ba885b35182e38068b353866d1e07a00ffeab0b495"} Oct 09 19:32:04 crc kubenswrapper[4907]: I1009 19:32:04.736555 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:32:04 crc kubenswrapper[4907]: I1009 19:32:04.737671 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:32:04 crc kubenswrapper[4907]: I1009 19:32:04.799381 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:32:05 crc kubenswrapper[4907]: I1009 19:32:05.202383 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:32:05 crc kubenswrapper[4907]: I1009 19:32:05.247704 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:32:05 crc kubenswrapper[4907]: I1009 19:32:05.530317 4907 generic.go:334] "Generic (PLEG): container finished" podID="61589af1-8a53-445d-afef-ff35192b01a5" containerID="c5fe0b2d2e14b7fb67056ed574d9cbb8770e532889d464fdad1ee1d6b00b2a76" exitCode=0 Oct 09 19:32:05 crc kubenswrapper[4907]: I1009 19:32:05.530353 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vchbq" event={"ID":"61589af1-8a53-445d-afef-ff35192b01a5","Type":"ContainerDied","Data":"c5fe0b2d2e14b7fb67056ed574d9cbb8770e532889d464fdad1ee1d6b00b2a76"} Oct 09 19:32:05 crc kubenswrapper[4907]: I1009 19:32:05.580743 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.299638 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.300161 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.300235 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:32:06 crc kubenswrapper[4907]: 
I1009 19:32:06.301207 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.301418 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61" gracePeriod=600 Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.544516 4907 generic.go:334] "Generic (PLEG): container finished" podID="84707a79-5b88-454b-9e1f-5618515a5623" containerID="7e67af814ae925fab1067532e5e0bb977f655caa3a2261f030a900e588db23d0" exitCode=0 Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.544632 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7t9j5" event={"ID":"84707a79-5b88-454b-9e1f-5618515a5623","Type":"ContainerDied","Data":"7e67af814ae925fab1067532e5e0bb977f655caa3a2261f030a900e588db23d0"} Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.549455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vchbq" event={"ID":"61589af1-8a53-445d-afef-ff35192b01a5","Type":"ContainerStarted","Data":"906f1b289d432510620b8d64b024291d5ae3a7912e6740f1a2323cb61e4d58e1"} Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.554043 4907 generic.go:334] "Generic (PLEG): container finished" podID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerID="393b3a4cfcb2d59c34a39a571fb0001ecf0aa922e8964d8b55016c069f608379" exitCode=0 Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 
19:32:06.554161 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjfsp" event={"ID":"36fc9599-8a55-4896-b7a0-531c72c7da25","Type":"ContainerDied","Data":"393b3a4cfcb2d59c34a39a571fb0001ecf0aa922e8964d8b55016c069f608379"} Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.557490 4907 generic.go:334] "Generic (PLEG): container finished" podID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerID="fb58b2c39769a52698638137761aaadf646fdc3eb10b8b0d8c0183f7af484620" exitCode=0 Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.557576 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v9qxz" event={"ID":"a8f953fc-1eac-414d-b93e-e98eaa5aea79","Type":"ContainerDied","Data":"fb58b2c39769a52698638137761aaadf646fdc3eb10b8b0d8c0183f7af484620"} Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.567669 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27prd" event={"ID":"cec75db6-ed33-4e33-b35c-44a59a054859","Type":"ContainerStarted","Data":"04061c66ed8cfbe6e37a6f11c38c59cc863b80c56488f354932897a666f4ad84"} Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.589195 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61" exitCode=0 Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.589342 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61"} Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.625526 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5b4h" 
event={"ID":"bf99f768-d09e-4105-9150-39b510795216","Type":"ContainerStarted","Data":"c893f2c17fe8f0a0f509fdc2121643ec409472bed6ef6ae2a4ef48a54384bd71"} Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.636758 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vchbq" podStartSLOduration=4.149335339 podStartE2EDuration="1m5.636740106s" podCreationTimestamp="2025-10-09 19:31:01 +0000 UTC" firstStartedPulling="2025-10-09 19:31:04.715855015 +0000 UTC m=+150.247822504" lastFinishedPulling="2025-10-09 19:32:06.203259782 +0000 UTC m=+211.735227271" observedRunningTime="2025-10-09 19:32:06.636034646 +0000 UTC m=+212.168002145" watchObservedRunningTime="2025-10-09 19:32:06.636740106 +0000 UTC m=+212.168707605" Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.745411 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-27prd" podStartSLOduration=4.596687503 podStartE2EDuration="1m5.74538075s" podCreationTimestamp="2025-10-09 19:31:01 +0000 UTC" firstStartedPulling="2025-10-09 19:31:04.648715465 +0000 UTC m=+150.180682954" lastFinishedPulling="2025-10-09 19:32:05.797408712 +0000 UTC m=+211.329376201" observedRunningTime="2025-10-09 19:32:06.742497399 +0000 UTC m=+212.274464908" watchObservedRunningTime="2025-10-09 19:32:06.74538075 +0000 UTC m=+212.277348249" Oct 09 19:32:06 crc kubenswrapper[4907]: I1009 19:32:06.765307 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b5b4h" podStartSLOduration=3.705533388 podStartE2EDuration="1m5.765287007s" podCreationTimestamp="2025-10-09 19:31:01 +0000 UTC" firstStartedPulling="2025-10-09 19:31:03.52659088 +0000 UTC m=+149.058558369" lastFinishedPulling="2025-10-09 19:32:05.586344499 +0000 UTC m=+211.118311988" observedRunningTime="2025-10-09 19:32:06.763773835 +0000 UTC m=+212.295741334" 
watchObservedRunningTime="2025-10-09 19:32:06.765287007 +0000 UTC m=+212.297254496" Oct 09 19:32:07 crc kubenswrapper[4907]: I1009 19:32:07.632751 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"6b36d2556ee836a20add9ae68257132453204781c07c26a6515296da113e1362"} Oct 09 19:32:08 crc kubenswrapper[4907]: I1009 19:32:08.587951 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lz9vz"] Oct 09 19:32:08 crc kubenswrapper[4907]: I1009 19:32:08.588841 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lz9vz" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="registry-server" containerID="cri-o://c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209" gracePeriod=2 Oct 09 19:32:08 crc kubenswrapper[4907]: I1009 19:32:08.643867 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjfsp" event={"ID":"36fc9599-8a55-4896-b7a0-531c72c7da25","Type":"ContainerStarted","Data":"5bccc1664f6104176b517eb495eca940ef0c5e692d61c59c4fdb26fecff12132"} Oct 09 19:32:08 crc kubenswrapper[4907]: I1009 19:32:08.645937 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v9qxz" event={"ID":"a8f953fc-1eac-414d-b93e-e98eaa5aea79","Type":"ContainerStarted","Data":"7b4cce3853802ee5950c2139ad0a54c386950adc2617c7403e75592e06081e52"} Oct 09 19:32:08 crc kubenswrapper[4907]: I1009 19:32:08.671229 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zjfsp" podStartSLOduration=4.210686633 podStartE2EDuration="1m5.671208744s" podCreationTimestamp="2025-10-09 19:31:03 +0000 UTC" firstStartedPulling="2025-10-09 19:31:05.789714293 +0000 UTC m=+151.321681782" 
lastFinishedPulling="2025-10-09 19:32:07.250236404 +0000 UTC m=+212.782203893" observedRunningTime="2025-10-09 19:32:08.666801111 +0000 UTC m=+214.198768610" watchObservedRunningTime="2025-10-09 19:32:08.671208744 +0000 UTC m=+214.203176253" Oct 09 19:32:08 crc kubenswrapper[4907]: I1009 19:32:08.691734 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v9qxz" podStartSLOduration=4.35520452 podStartE2EDuration="1m5.691677948s" podCreationTimestamp="2025-10-09 19:31:03 +0000 UTC" firstStartedPulling="2025-10-09 19:31:05.840895204 +0000 UTC m=+151.372862693" lastFinishedPulling="2025-10-09 19:32:07.177368632 +0000 UTC m=+212.709336121" observedRunningTime="2025-10-09 19:32:08.689281401 +0000 UTC m=+214.221248910" watchObservedRunningTime="2025-10-09 19:32:08.691677948 +0000 UTC m=+214.223645457" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.017729 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.095320 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npc58\" (UniqueName: \"kubernetes.io/projected/ceb6c018-5abf-4ec5-b111-82d3a60ff855-kube-api-access-npc58\") pod \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.095441 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-utilities\") pod \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.095561 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-catalog-content\") pod \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\" (UID: \"ceb6c018-5abf-4ec5-b111-82d3a60ff855\") " Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.096740 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-utilities" (OuterVolumeSpecName: "utilities") pod "ceb6c018-5abf-4ec5-b111-82d3a60ff855" (UID: "ceb6c018-5abf-4ec5-b111-82d3a60ff855"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.105740 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb6c018-5abf-4ec5-b111-82d3a60ff855-kube-api-access-npc58" (OuterVolumeSpecName: "kube-api-access-npc58") pod "ceb6c018-5abf-4ec5-b111-82d3a60ff855" (UID: "ceb6c018-5abf-4ec5-b111-82d3a60ff855"). InnerVolumeSpecName "kube-api-access-npc58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.196901 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npc58\" (UniqueName: \"kubernetes.io/projected/ceb6c018-5abf-4ec5-b111-82d3a60ff855-kube-api-access-npc58\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.197269 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.204112 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceb6c018-5abf-4ec5-b111-82d3a60ff855" (UID: "ceb6c018-5abf-4ec5-b111-82d3a60ff855"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.299002 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceb6c018-5abf-4ec5-b111-82d3a60ff855-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.657235 4907 generic.go:334] "Generic (PLEG): container finished" podID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerID="c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209" exitCode=0 Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.657318 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lz9vz" event={"ID":"ceb6c018-5abf-4ec5-b111-82d3a60ff855","Type":"ContainerDied","Data":"c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209"} Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.657387 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lz9vz" event={"ID":"ceb6c018-5abf-4ec5-b111-82d3a60ff855","Type":"ContainerDied","Data":"112fa03932009aeb48150a4eb6fba640594bf5cac3cb5fdbd77dd44cfe637a0b"} Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.657443 4907 scope.go:117] "RemoveContainer" containerID="c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.657602 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lz9vz" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.664311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7t9j5" event={"ID":"84707a79-5b88-454b-9e1f-5618515a5623","Type":"ContainerStarted","Data":"e1a1453bbf7df68b944dfea1b164d0076eb52926ebc067ac339cb1b6c1eeb6a0"} Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.688316 4907 scope.go:117] "RemoveContainer" containerID="3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.711792 4907 scope.go:117] "RemoveContainer" containerID="8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.727543 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7t9j5" podStartSLOduration=3.577960487 podStartE2EDuration="1m8.727516007s" podCreationTimestamp="2025-10-09 19:31:01 +0000 UTC" firstStartedPulling="2025-10-09 19:31:03.563105436 +0000 UTC m=+149.095072925" lastFinishedPulling="2025-10-09 19:32:08.712660956 +0000 UTC m=+214.244628445" observedRunningTime="2025-10-09 19:32:09.714084851 +0000 UTC m=+215.246052340" watchObservedRunningTime="2025-10-09 19:32:09.727516007 +0000 UTC m=+215.259483496" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.730839 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lz9vz"] Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.737507 4907 scope.go:117] "RemoveContainer" containerID="c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209" Oct 09 19:32:09 crc kubenswrapper[4907]: E1009 19:32:09.738432 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209\": 
container with ID starting with c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209 not found: ID does not exist" containerID="c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.738484 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209"} err="failed to get container status \"c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209\": rpc error: code = NotFound desc = could not find container \"c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209\": container with ID starting with c9974a01b0f84dc48656923b617b3fcba135ef4b17df193dc4d3a5b67945b209 not found: ID does not exist" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.738518 4907 scope.go:117] "RemoveContainer" containerID="3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6" Oct 09 19:32:09 crc kubenswrapper[4907]: E1009 19:32:09.739871 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6\": container with ID starting with 3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6 not found: ID does not exist" containerID="3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.739908 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6"} err="failed to get container status \"3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6\": rpc error: code = NotFound desc = could not find container \"3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6\": container with ID starting with 
3d3a92eb7e6b6d7bfa4a2e766fa1a1eb8f50246de7f3b4e5e5e745efb7be42b6 not found: ID does not exist" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.739941 4907 scope.go:117] "RemoveContainer" containerID="8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.740116 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lz9vz"] Oct 09 19:32:09 crc kubenswrapper[4907]: E1009 19:32:09.740276 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31\": container with ID starting with 8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31 not found: ID does not exist" containerID="8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31" Oct 09 19:32:09 crc kubenswrapper[4907]: I1009 19:32:09.740311 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31"} err="failed to get container status \"8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31\": rpc error: code = NotFound desc = could not find container \"8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31\": container with ID starting with 8d60b59aa7caa6e32530e0d760c03750946aac9127b60f311fd8d0c2ab125f31 not found: ID does not exist" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.160317 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" path="/var/lib/kubelet/pods/ceb6c018-5abf-4ec5-b111-82d3a60ff855/volumes" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.587175 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 
19:32:11.587700 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.636704 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.790757 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.791348 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.832211 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.983921 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:32:11 crc kubenswrapper[4907]: I1009 19:32:11.984248 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:32:12 crc kubenswrapper[4907]: I1009 19:32:12.033872 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:32:12 crc kubenswrapper[4907]: I1009 19:32:12.207153 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:32:12 crc kubenswrapper[4907]: I1009 19:32:12.207994 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:32:12 crc kubenswrapper[4907]: I1009 19:32:12.265600 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:32:12 crc kubenswrapper[4907]: I1009 19:32:12.728739 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:32:12 crc kubenswrapper[4907]: I1009 19:32:12.737178 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:32:12 crc kubenswrapper[4907]: I1009 19:32:12.746882 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:32:13 crc kubenswrapper[4907]: I1009 19:32:13.779709 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:32:13 crc kubenswrapper[4907]: I1009 19:32:13.780756 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:32:13 crc kubenswrapper[4907]: I1009 19:32:13.833268 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:32:14 crc kubenswrapper[4907]: I1009 19:32:14.235880 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:32:14 crc kubenswrapper[4907]: I1009 19:32:14.235939 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:32:14 crc kubenswrapper[4907]: I1009 19:32:14.276945 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:32:14 crc kubenswrapper[4907]: I1009 19:32:14.387273 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vchbq"] Oct 09 19:32:14 crc kubenswrapper[4907]: I1009 19:32:14.763393 
4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:32:14 crc kubenswrapper[4907]: I1009 19:32:14.770761 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:32:15 crc kubenswrapper[4907]: I1009 19:32:15.703110 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vchbq" podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="registry-server" containerID="cri-o://906f1b289d432510620b8d64b024291d5ae3a7912e6740f1a2323cb61e4d58e1" gracePeriod=2 Oct 09 19:32:16 crc kubenswrapper[4907]: I1009 19:32:16.787994 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27prd"] Oct 09 19:32:16 crc kubenswrapper[4907]: I1009 19:32:16.788312 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-27prd" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="registry-server" containerID="cri-o://04061c66ed8cfbe6e37a6f11c38c59cc863b80c56488f354932897a666f4ad84" gracePeriod=2 Oct 09 19:32:17 crc kubenswrapper[4907]: I1009 19:32:17.716250 4907 generic.go:334] "Generic (PLEG): container finished" podID="61589af1-8a53-445d-afef-ff35192b01a5" containerID="906f1b289d432510620b8d64b024291d5ae3a7912e6740f1a2323cb61e4d58e1" exitCode=0 Oct 09 19:32:17 crc kubenswrapper[4907]: I1009 19:32:17.716330 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vchbq" event={"ID":"61589af1-8a53-445d-afef-ff35192b01a5","Type":"ContainerDied","Data":"906f1b289d432510620b8d64b024291d5ae3a7912e6740f1a2323cb61e4d58e1"} Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.304431 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.442211 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcbq6\" (UniqueName: \"kubernetes.io/projected/61589af1-8a53-445d-afef-ff35192b01a5-kube-api-access-gcbq6\") pod \"61589af1-8a53-445d-afef-ff35192b01a5\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.442325 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-utilities\") pod \"61589af1-8a53-445d-afef-ff35192b01a5\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.442505 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-catalog-content\") pod \"61589af1-8a53-445d-afef-ff35192b01a5\" (UID: \"61589af1-8a53-445d-afef-ff35192b01a5\") " Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.443574 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-utilities" (OuterVolumeSpecName: "utilities") pod "61589af1-8a53-445d-afef-ff35192b01a5" (UID: "61589af1-8a53-445d-afef-ff35192b01a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.451152 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61589af1-8a53-445d-afef-ff35192b01a5-kube-api-access-gcbq6" (OuterVolumeSpecName: "kube-api-access-gcbq6") pod "61589af1-8a53-445d-afef-ff35192b01a5" (UID: "61589af1-8a53-445d-afef-ff35192b01a5"). InnerVolumeSpecName "kube-api-access-gcbq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.544171 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcbq6\" (UniqueName: \"kubernetes.io/projected/61589af1-8a53-445d-afef-ff35192b01a5-kube-api-access-gcbq6\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.544207 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.579961 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61589af1-8a53-445d-afef-ff35192b01a5" (UID: "61589af1-8a53-445d-afef-ff35192b01a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.645794 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61589af1-8a53-445d-afef-ff35192b01a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.725323 4907 generic.go:334] "Generic (PLEG): container finished" podID="cec75db6-ed33-4e33-b35c-44a59a054859" containerID="04061c66ed8cfbe6e37a6f11c38c59cc863b80c56488f354932897a666f4ad84" exitCode=0 Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.725405 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27prd" event={"ID":"cec75db6-ed33-4e33-b35c-44a59a054859","Type":"ContainerDied","Data":"04061c66ed8cfbe6e37a6f11c38c59cc863b80c56488f354932897a666f4ad84"} Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.728132 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-vchbq" event={"ID":"61589af1-8a53-445d-afef-ff35192b01a5","Type":"ContainerDied","Data":"448b2d7dce78ee41eec81d3d90e6a0d1a3a9a30b0dda22e564f2701aa064b337"} Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.728197 4907 scope.go:117] "RemoveContainer" containerID="906f1b289d432510620b8d64b024291d5ae3a7912e6740f1a2323cb61e4d58e1" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.728345 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vchbq" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.767323 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vchbq"] Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.768931 4907 scope.go:117] "RemoveContainer" containerID="c5fe0b2d2e14b7fb67056ed574d9cbb8770e532889d464fdad1ee1d6b00b2a76" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.771192 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vchbq"] Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.796743 4907 scope.go:117] "RemoveContainer" containerID="a0ff25678fa37abf81c3322eda43e83eb9654de0910382d4e5a56e622ad8560e" Oct 09 19:32:18 crc kubenswrapper[4907]: I1009 19:32:18.924181 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.050627 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b72p5\" (UniqueName: \"kubernetes.io/projected/cec75db6-ed33-4e33-b35c-44a59a054859-kube-api-access-b72p5\") pod \"cec75db6-ed33-4e33-b35c-44a59a054859\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.050745 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-catalog-content\") pod \"cec75db6-ed33-4e33-b35c-44a59a054859\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.050803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-utilities\") pod \"cec75db6-ed33-4e33-b35c-44a59a054859\" (UID: \"cec75db6-ed33-4e33-b35c-44a59a054859\") " Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.051648 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-utilities" (OuterVolumeSpecName: "utilities") pod "cec75db6-ed33-4e33-b35c-44a59a054859" (UID: "cec75db6-ed33-4e33-b35c-44a59a054859"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.055822 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec75db6-ed33-4e33-b35c-44a59a054859-kube-api-access-b72p5" (OuterVolumeSpecName: "kube-api-access-b72p5") pod "cec75db6-ed33-4e33-b35c-44a59a054859" (UID: "cec75db6-ed33-4e33-b35c-44a59a054859"). InnerVolumeSpecName "kube-api-access-b72p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.106668 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cec75db6-ed33-4e33-b35c-44a59a054859" (UID: "cec75db6-ed33-4e33-b35c-44a59a054859"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.152832 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.153154 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec75db6-ed33-4e33-b35c-44a59a054859-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.153293 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b72p5\" (UniqueName: \"kubernetes.io/projected/cec75db6-ed33-4e33-b35c-44a59a054859-kube-api-access-b72p5\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.164038 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61589af1-8a53-445d-afef-ff35192b01a5" path="/var/lib/kubelet/pods/61589af1-8a53-445d-afef-ff35192b01a5/volumes" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.187168 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjfsp"] Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.187510 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zjfsp" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" 
containerName="registry-server" containerID="cri-o://5bccc1664f6104176b517eb495eca940ef0c5e692d61c59c4fdb26fecff12132" gracePeriod=2 Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.735898 4907 generic.go:334] "Generic (PLEG): container finished" podID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerID="5bccc1664f6104176b517eb495eca940ef0c5e692d61c59c4fdb26fecff12132" exitCode=0 Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.735962 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjfsp" event={"ID":"36fc9599-8a55-4896-b7a0-531c72c7da25","Type":"ContainerDied","Data":"5bccc1664f6104176b517eb495eca940ef0c5e692d61c59c4fdb26fecff12132"} Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.738025 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-27prd" event={"ID":"cec75db6-ed33-4e33-b35c-44a59a054859","Type":"ContainerDied","Data":"43ee5baf42a0568e986898ba144b1414f9f955480e7d42b07277f7360f63fae6"} Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.738063 4907 scope.go:117] "RemoveContainer" containerID="04061c66ed8cfbe6e37a6f11c38c59cc863b80c56488f354932897a666f4ad84" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.738160 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-27prd" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.751737 4907 scope.go:117] "RemoveContainer" containerID="52f29d9737ceaef7419580ba885b35182e38068b353866d1e07a00ffeab0b495" Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.768920 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-27prd"] Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.774988 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-27prd"] Oct 09 19:32:19 crc kubenswrapper[4907]: I1009 19:32:19.777009 4907 scope.go:117] "RemoveContainer" containerID="53f2669a7b2ff3c9d395001ca1d0e01d719bbfa8988393966f2e23a3afb178b5" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.236203 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.371641 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78r9c\" (UniqueName: \"kubernetes.io/projected/36fc9599-8a55-4896-b7a0-531c72c7da25-kube-api-access-78r9c\") pod \"36fc9599-8a55-4896-b7a0-531c72c7da25\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.371756 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-catalog-content\") pod \"36fc9599-8a55-4896-b7a0-531c72c7da25\" (UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.371932 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-utilities\") pod \"36fc9599-8a55-4896-b7a0-531c72c7da25\" 
(UID: \"36fc9599-8a55-4896-b7a0-531c72c7da25\") " Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.373024 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-utilities" (OuterVolumeSpecName: "utilities") pod "36fc9599-8a55-4896-b7a0-531c72c7da25" (UID: "36fc9599-8a55-4896-b7a0-531c72c7da25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.378048 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fc9599-8a55-4896-b7a0-531c72c7da25-kube-api-access-78r9c" (OuterVolumeSpecName: "kube-api-access-78r9c") pod "36fc9599-8a55-4896-b7a0-531c72c7da25" (UID: "36fc9599-8a55-4896-b7a0-531c72c7da25"). InnerVolumeSpecName "kube-api-access-78r9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.388646 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36fc9599-8a55-4896-b7a0-531c72c7da25" (UID: "36fc9599-8a55-4896-b7a0-531c72c7da25"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.473961 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78r9c\" (UniqueName: \"kubernetes.io/projected/36fc9599-8a55-4896-b7a0-531c72c7da25-kube-api-access-78r9c\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.474016 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.474047 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36fc9599-8a55-4896-b7a0-531c72c7da25-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.751739 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zjfsp" event={"ID":"36fc9599-8a55-4896-b7a0-531c72c7da25","Type":"ContainerDied","Data":"18f8ff928c071bfd57c7641dfbdadb8895add8dc842b2b6bc8f7b6e6871f9611"} Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.751836 4907 scope.go:117] "RemoveContainer" containerID="5bccc1664f6104176b517eb495eca940ef0c5e692d61c59c4fdb26fecff12132" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.751873 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zjfsp" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.774893 4907 scope.go:117] "RemoveContainer" containerID="393b3a4cfcb2d59c34a39a571fb0001ecf0aa922e8964d8b55016c069f608379" Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.791667 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjfsp"] Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.797905 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zjfsp"] Oct 09 19:32:20 crc kubenswrapper[4907]: I1009 19:32:20.820670 4907 scope.go:117] "RemoveContainer" containerID="b6e144c01c23d9ba19e9ff3e9253de3a282b6646000bfab3a81a4d8848b0e4d3" Oct 09 19:32:21 crc kubenswrapper[4907]: I1009 19:32:21.159395 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" path="/var/lib/kubelet/pods/36fc9599-8a55-4896-b7a0-531c72c7da25/volumes" Oct 09 19:32:21 crc kubenswrapper[4907]: I1009 19:32:21.160778 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" path="/var/lib/kubelet/pods/cec75db6-ed33-4e33-b35c-44a59a054859/volumes" Oct 09 19:32:21 crc kubenswrapper[4907]: I1009 19:32:21.640881 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:32:25 crc kubenswrapper[4907]: I1009 19:32:25.659336 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hztkx"] Oct 09 19:32:50 crc kubenswrapper[4907]: I1009 19:32:50.697473 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" podUID="efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" containerName="oauth-openshift" 
containerID="cri-o://e0df03f9816375e9304fd9c8c3a77c69bde110a2eaf019bdea3454dd0fdeff7b" gracePeriod=15 Oct 09 19:32:50 crc kubenswrapper[4907]: I1009 19:32:50.977322 4907 generic.go:334] "Generic (PLEG): container finished" podID="efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" containerID="e0df03f9816375e9304fd9c8c3a77c69bde110a2eaf019bdea3454dd0fdeff7b" exitCode=0 Oct 09 19:32:50 crc kubenswrapper[4907]: I1009 19:32:50.977385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" event={"ID":"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79","Type":"ContainerDied","Data":"e0df03f9816375e9304fd9c8c3a77c69bde110a2eaf019bdea3454dd0fdeff7b"} Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.127794 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.180708 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6bf5fff678-j7ztb"] Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181173 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181198 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181223 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181235 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181248 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181258 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181270 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181278 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181291 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181300 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181310 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181318 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="extract-content" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181337 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" containerName="oauth-openshift" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181345 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" containerName="oauth-openshift" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181356 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8fc60943-2e04-4c5c-a8e1-0bdf136be24c" containerName="pruner" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181363 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc60943-2e04-4c5c-a8e1-0bdf136be24c" containerName="pruner" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181375 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec71f741-3829-46f9-aa15-5f0caa8552a1" containerName="pruner" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181382 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec71f741-3829-46f9-aa15-5f0caa8552a1" containerName="pruner" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181397 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181407 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181417 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181425 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181440 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181448 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="extract-utilities" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181458 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181483 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181501 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181513 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: E1009 19:32:51.181525 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181534 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181688 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec75db6-ed33-4e33-b35c-44a59a054859" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181705 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" containerName="oauth-openshift" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181714 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc60943-2e04-4c5c-a8e1-0bdf136be24c" containerName="pruner" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181725 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec71f741-3829-46f9-aa15-5f0caa8552a1" containerName="pruner" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181733 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="36fc9599-8a55-4896-b7a0-531c72c7da25" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181742 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="61589af1-8a53-445d-afef-ff35192b01a5" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.181751 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb6c018-5abf-4ec5-b111-82d3a60ff855" containerName="registry-server" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.182255 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bf5fff678-j7ztb"] Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.182356 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274067 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-cliconfig\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274136 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-login\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274184 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w62l9\" (UniqueName: \"kubernetes.io/projected/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-kube-api-access-w62l9\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: 
\"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274231 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-trusted-ca-bundle\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274276 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-serving-cert\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274311 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-service-ca\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274343 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-policies\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274373 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-router-certs\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 
19:32:51.274414 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-provider-selection\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274445 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-session\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274472 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-error\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274522 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-idp-0-file-data\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274560 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-ocp-branding-template\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274614 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-dir\") pod \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\" (UID: \"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79\") " Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.274834 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-session\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275415 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275723 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275742 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-login\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275896 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275919 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275957 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjc5k\" (UniqueName: \"kubernetes.io/projected/9afd8e2f-d58c-456e-b140-1ffd0af14f18-kube-api-access-kjc5k\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.275994 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9afd8e2f-d58c-456e-b140-1ffd0af14f18-audit-dir\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276037 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-error\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276090 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276165 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276207 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276244 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276281 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-audit-policies\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276324 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276362 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276439 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276469 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276516 4907 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276534 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.276844 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.282620 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.283311 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-kube-api-access-w62l9" (OuterVolumeSpecName: "kube-api-access-w62l9") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "kube-api-access-w62l9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.283323 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.283793 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.284163 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.285719 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.286080 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.289313 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.294883 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" (UID: "efb7004c-f1a6-4f5f-9358-c72a1b2c9b79"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378154 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-audit-policies\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378300 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378327 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378352 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-session\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378434 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-login\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378472 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378603 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjc5k\" (UniqueName: \"kubernetes.io/projected/9afd8e2f-d58c-456e-b140-1ffd0af14f18-kube-api-access-kjc5k\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " 
pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378632 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9afd8e2f-d58c-456e-b140-1ffd0af14f18-audit-dir\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378656 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-error\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378723 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378800 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378817 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378835 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378849 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378862 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378876 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378890 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378905 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378919 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w62l9\" (UniqueName: \"kubernetes.io/projected/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-kube-api-access-w62l9\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.378932 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.379158 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-audit-policies\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.379333 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.379607 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9afd8e2f-d58c-456e-b140-1ffd0af14f18-audit-dir\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.379675 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-service-ca\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.380409 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.384040 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-session\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 
19:32:51.384184 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-login\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.384060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.384104 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-error\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.384424 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.385527 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.385789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-system-router-certs\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.385903 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9afd8e2f-d58c-456e-b140-1ffd0af14f18-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.411147 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjc5k\" (UniqueName: \"kubernetes.io/projected/9afd8e2f-d58c-456e-b140-1ffd0af14f18-kube-api-access-kjc5k\") pod \"oauth-openshift-6bf5fff678-j7ztb\" (UID: \"9afd8e2f-d58c-456e-b140-1ffd0af14f18\") " pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.504432 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.986053 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" event={"ID":"efb7004c-f1a6-4f5f-9358-c72a1b2c9b79","Type":"ContainerDied","Data":"92d93cba88f43252dd356489c1d3aa956e8d09791d3cb6b5e37062860bd3bd08"} Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.986158 4907 scope.go:117] "RemoveContainer" containerID="e0df03f9816375e9304fd9c8c3a77c69bde110a2eaf019bdea3454dd0fdeff7b" Oct 09 19:32:51 crc kubenswrapper[4907]: I1009 19:32:51.986191 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hztkx" Oct 09 19:32:52 crc kubenswrapper[4907]: I1009 19:32:51.998863 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6bf5fff678-j7ztb"] Oct 09 19:32:52 crc kubenswrapper[4907]: I1009 19:32:52.030416 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hztkx"] Oct 09 19:32:52 crc kubenswrapper[4907]: I1009 19:32:52.031914 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hztkx"] Oct 09 19:32:52 crc kubenswrapper[4907]: I1009 19:32:52.995173 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" event={"ID":"9afd8e2f-d58c-456e-b140-1ffd0af14f18","Type":"ContainerStarted","Data":"0c52ebd6d24af5d8730cab8a15b1cc2c4b0040f25e038edf7343adf92e18c4b5"} Oct 09 19:32:52 crc kubenswrapper[4907]: I1009 19:32:52.995231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" 
event={"ID":"9afd8e2f-d58c-456e-b140-1ffd0af14f18","Type":"ContainerStarted","Data":"8a7734ab4367e7cd9c5c26c40624003e4409a16a050be9c4e69ee971a51cff64"} Oct 09 19:32:52 crc kubenswrapper[4907]: I1009 19:32:52.995630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:53 crc kubenswrapper[4907]: I1009 19:32:53.001058 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" Oct 09 19:32:53 crc kubenswrapper[4907]: I1009 19:32:53.024130 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6bf5fff678-j7ztb" podStartSLOduration=28.024111236 podStartE2EDuration="28.024111236s" podCreationTimestamp="2025-10-09 19:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:32:53.020843904 +0000 UTC m=+258.552811453" watchObservedRunningTime="2025-10-09 19:32:53.024111236 +0000 UTC m=+258.556078725" Oct 09 19:32:53 crc kubenswrapper[4907]: I1009 19:32:53.159594 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb7004c-f1a6-4f5f-9358-c72a1b2c9b79" path="/var/lib/kubelet/pods/efb7004c-f1a6-4f5f-9358-c72a1b2c9b79/volumes" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.271168 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5b4h"] Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.272225 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b5b4h" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="registry-server" containerID="cri-o://c893f2c17fe8f0a0f509fdc2121643ec409472bed6ef6ae2a4ef48a54384bd71" gracePeriod=30 Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.284196 
4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7t9j5"] Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.284614 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7t9j5" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="registry-server" containerID="cri-o://e1a1453bbf7df68b944dfea1b164d0076eb52926ebc067ac339cb1b6c1eeb6a0" gracePeriod=30 Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.303023 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6rv64"] Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.303273 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" podUID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" containerName="marketplace-operator" containerID="cri-o://752d4c754c30fbb378d359ec3690e73283c5e0aec792559042ad2d540303decf" gracePeriod=30 Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.307721 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v9qxz"] Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.308080 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v9qxz" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="registry-server" containerID="cri-o://7b4cce3853802ee5950c2139ad0a54c386950adc2617c7403e75592e06081e52" gracePeriod=30 Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.324830 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dmrx4"] Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.325865 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.333050 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rddhr"] Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.333861 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rddhr" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="registry-server" containerID="cri-o://2b337a9fae11ccc3b7cff8f949d4ba92b7018116607bfddc5bb95b961251d180" gracePeriod=30 Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.345227 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dmrx4"] Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.460633 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d043494b-98ab-482a-ba53-5f2445d01bea-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.461229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d043494b-98ab-482a-ba53-5f2445d01bea-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.461460 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmr22\" (UniqueName: 
\"kubernetes.io/projected/d043494b-98ab-482a-ba53-5f2445d01bea-kube-api-access-bmr22\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.563086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d043494b-98ab-482a-ba53-5f2445d01bea-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.563170 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmr22\" (UniqueName: \"kubernetes.io/projected/d043494b-98ab-482a-ba53-5f2445d01bea-kube-api-access-bmr22\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.563204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d043494b-98ab-482a-ba53-5f2445d01bea-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.564590 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d043494b-98ab-482a-ba53-5f2445d01bea-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc 
kubenswrapper[4907]: I1009 19:33:08.571750 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d043494b-98ab-482a-ba53-5f2445d01bea-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.580329 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmr22\" (UniqueName: \"kubernetes.io/projected/d043494b-98ab-482a-ba53-5f2445d01bea-kube-api-access-bmr22\") pod \"marketplace-operator-79b997595-dmrx4\" (UID: \"d043494b-98ab-482a-ba53-5f2445d01bea\") " pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:08 crc kubenswrapper[4907]: I1009 19:33:08.658834 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.105344 4907 generic.go:334] "Generic (PLEG): container finished" podID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerID="7b4cce3853802ee5950c2139ad0a54c386950adc2617c7403e75592e06081e52" exitCode=0 Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.105437 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v9qxz" event={"ID":"a8f953fc-1eac-414d-b93e-e98eaa5aea79","Type":"ContainerDied","Data":"7b4cce3853802ee5950c2139ad0a54c386950adc2617c7403e75592e06081e52"} Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.112434 4907 generic.go:334] "Generic (PLEG): container finished" podID="1d25de17-0079-44bd-9595-ea432cbd0982" containerID="2b337a9fae11ccc3b7cff8f949d4ba92b7018116607bfddc5bb95b961251d180" exitCode=0 Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.112519 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-rddhr" event={"ID":"1d25de17-0079-44bd-9595-ea432cbd0982","Type":"ContainerDied","Data":"2b337a9fae11ccc3b7cff8f949d4ba92b7018116607bfddc5bb95b961251d180"} Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.117337 4907 generic.go:334] "Generic (PLEG): container finished" podID="bf99f768-d09e-4105-9150-39b510795216" containerID="c893f2c17fe8f0a0f509fdc2121643ec409472bed6ef6ae2a4ef48a54384bd71" exitCode=0 Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.117412 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5b4h" event={"ID":"bf99f768-d09e-4105-9150-39b510795216","Type":"ContainerDied","Data":"c893f2c17fe8f0a0f509fdc2121643ec409472bed6ef6ae2a4ef48a54384bd71"} Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.121038 4907 generic.go:334] "Generic (PLEG): container finished" podID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" containerID="752d4c754c30fbb378d359ec3690e73283c5e0aec792559042ad2d540303decf" exitCode=0 Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.121110 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" event={"ID":"92eb9688-52c0-4ba4-8a82-3f874d85e2cf","Type":"ContainerDied","Data":"752d4c754c30fbb378d359ec3690e73283c5e0aec792559042ad2d540303decf"} Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.144828 4907 generic.go:334] "Generic (PLEG): container finished" podID="84707a79-5b88-454b-9e1f-5618515a5623" containerID="e1a1453bbf7df68b944dfea1b164d0076eb52926ebc067ac339cb1b6c1eeb6a0" exitCode=0 Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.144889 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7t9j5" event={"ID":"84707a79-5b88-454b-9e1f-5618515a5623","Type":"ContainerDied","Data":"e1a1453bbf7df68b944dfea1b164d0076eb52926ebc067ac339cb1b6c1eeb6a0"} Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 
19:33:09.183875 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dmrx4"] Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.237859 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.375080 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-utilities\") pod \"bf99f768-d09e-4105-9150-39b510795216\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.375356 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-catalog-content\") pod \"bf99f768-d09e-4105-9150-39b510795216\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.375429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcxwm\" (UniqueName: \"kubernetes.io/projected/bf99f768-d09e-4105-9150-39b510795216-kube-api-access-dcxwm\") pod \"bf99f768-d09e-4105-9150-39b510795216\" (UID: \"bf99f768-d09e-4105-9150-39b510795216\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.377481 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-utilities" (OuterVolumeSpecName: "utilities") pod "bf99f768-d09e-4105-9150-39b510795216" (UID: "bf99f768-d09e-4105-9150-39b510795216"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.383043 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf99f768-d09e-4105-9150-39b510795216-kube-api-access-dcxwm" (OuterVolumeSpecName: "kube-api-access-dcxwm") pod "bf99f768-d09e-4105-9150-39b510795216" (UID: "bf99f768-d09e-4105-9150-39b510795216"). InnerVolumeSpecName "kube-api-access-dcxwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.429260 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.445874 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf99f768-d09e-4105-9150-39b510795216" (UID: "bf99f768-d09e-4105-9150-39b510795216"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.476936 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcxwm\" (UniqueName: \"kubernetes.io/projected/bf99f768-d09e-4105-9150-39b510795216-kube-api-access-dcxwm\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.476974 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.476989 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf99f768-d09e-4105-9150-39b510795216-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.551493 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.555166 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.561001 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.578793 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-catalog-content\") pod \"1d25de17-0079-44bd-9595-ea432cbd0982\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.578851 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-utilities\") pod \"1d25de17-0079-44bd-9595-ea432cbd0982\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.579910 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-utilities" (OuterVolumeSpecName: "utilities") pod "1d25de17-0079-44bd-9595-ea432cbd0982" (UID: "1d25de17-0079-44bd-9595-ea432cbd0982"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.580023 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sm5w\" (UniqueName: \"kubernetes.io/projected/1d25de17-0079-44bd-9595-ea432cbd0982-kube-api-access-4sm5w\") pod \"1d25de17-0079-44bd-9595-ea432cbd0982\" (UID: \"1d25de17-0079-44bd-9595-ea432cbd0982\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.580965 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.585554 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d25de17-0079-44bd-9595-ea432cbd0982-kube-api-access-4sm5w" (OuterVolumeSpecName: "kube-api-access-4sm5w") pod "1d25de17-0079-44bd-9595-ea432cbd0982" (UID: "1d25de17-0079-44bd-9595-ea432cbd0982"). InnerVolumeSpecName "kube-api-access-4sm5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.677665 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d25de17-0079-44bd-9595-ea432cbd0982" (UID: "1d25de17-0079-44bd-9595-ea432cbd0982"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682113 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-catalog-content\") pod \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682185 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2b7k\" (UniqueName: \"kubernetes.io/projected/84707a79-5b88-454b-9e1f-5618515a5623-kube-api-access-d2b7k\") pod \"84707a79-5b88-454b-9e1f-5618515a5623\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682228 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-catalog-content\") pod \"84707a79-5b88-454b-9e1f-5618515a5623\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdx2b\" (UniqueName: \"kubernetes.io/projected/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-kube-api-access-vdx2b\") pod \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682322 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-trusted-ca\") pod \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682352 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-operator-metrics\") pod \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\" (UID: \"92eb9688-52c0-4ba4-8a82-3f874d85e2cf\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682401 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-utilities\") pod \"84707a79-5b88-454b-9e1f-5618515a5623\" (UID: \"84707a79-5b88-454b-9e1f-5618515a5623\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682436 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-utilities\") pod \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682513 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmskm\" (UniqueName: \"kubernetes.io/projected/a8f953fc-1eac-414d-b93e-e98eaa5aea79-kube-api-access-nmskm\") pod \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\" (UID: \"a8f953fc-1eac-414d-b93e-e98eaa5aea79\") " Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682826 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sm5w\" (UniqueName: \"kubernetes.io/projected/1d25de17-0079-44bd-9595-ea432cbd0982-kube-api-access-4sm5w\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.682844 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d25de17-0079-44bd-9595-ea432cbd0982-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.683202 4907 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "92eb9688-52c0-4ba4-8a82-3f874d85e2cf" (UID: "92eb9688-52c0-4ba4-8a82-3f874d85e2cf"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.683297 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-utilities" (OuterVolumeSpecName: "utilities") pod "84707a79-5b88-454b-9e1f-5618515a5623" (UID: "84707a79-5b88-454b-9e1f-5618515a5623"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.683297 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-utilities" (OuterVolumeSpecName: "utilities") pod "a8f953fc-1eac-414d-b93e-e98eaa5aea79" (UID: "a8f953fc-1eac-414d-b93e-e98eaa5aea79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.686213 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-kube-api-access-vdx2b" (OuterVolumeSpecName: "kube-api-access-vdx2b") pod "92eb9688-52c0-4ba4-8a82-3f874d85e2cf" (UID: "92eb9688-52c0-4ba4-8a82-3f874d85e2cf"). InnerVolumeSpecName "kube-api-access-vdx2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.686299 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84707a79-5b88-454b-9e1f-5618515a5623-kube-api-access-d2b7k" (OuterVolumeSpecName: "kube-api-access-d2b7k") pod "84707a79-5b88-454b-9e1f-5618515a5623" (UID: "84707a79-5b88-454b-9e1f-5618515a5623"). InnerVolumeSpecName "kube-api-access-d2b7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.687724 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f953fc-1eac-414d-b93e-e98eaa5aea79-kube-api-access-nmskm" (OuterVolumeSpecName: "kube-api-access-nmskm") pod "a8f953fc-1eac-414d-b93e-e98eaa5aea79" (UID: "a8f953fc-1eac-414d-b93e-e98eaa5aea79"). InnerVolumeSpecName "kube-api-access-nmskm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.690752 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "92eb9688-52c0-4ba4-8a82-3f874d85e2cf" (UID: "92eb9688-52c0-4ba4-8a82-3f874d85e2cf"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.698771 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8f953fc-1eac-414d-b93e-e98eaa5aea79" (UID: "a8f953fc-1eac-414d-b93e-e98eaa5aea79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.740426 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84707a79-5b88-454b-9e1f-5618515a5623" (UID: "84707a79-5b88-454b-9e1f-5618515a5623"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784043 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784099 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2b7k\" (UniqueName: \"kubernetes.io/projected/84707a79-5b88-454b-9e1f-5618515a5623-kube-api-access-d2b7k\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784117 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784127 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdx2b\" (UniqueName: \"kubernetes.io/projected/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-kube-api-access-vdx2b\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784137 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784147 4907 reconciler_common.go:293] "Volume detached 
for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92eb9688-52c0-4ba4-8a82-3f874d85e2cf-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784157 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84707a79-5b88-454b-9e1f-5618515a5623-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784167 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8f953fc-1eac-414d-b93e-e98eaa5aea79-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:09 crc kubenswrapper[4907]: I1009 19:33:09.784175 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmskm\" (UniqueName: \"kubernetes.io/projected/a8f953fc-1eac-414d-b93e-e98eaa5aea79-kube-api-access-nmskm\") on node \"crc\" DevicePath \"\"" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.153349 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5b4h" event={"ID":"bf99f768-d09e-4105-9150-39b510795216","Type":"ContainerDied","Data":"c2b477642716e6dfe4617b6bc52a5ecf2867e3f197f43b10b07dccf5a7a8669c"} Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.153407 4907 scope.go:117] "RemoveContainer" containerID="c893f2c17fe8f0a0f509fdc2121643ec409472bed6ef6ae2a4ef48a54384bd71" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.153638 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5b4h" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.155168 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" event={"ID":"92eb9688-52c0-4ba4-8a82-3f874d85e2cf","Type":"ContainerDied","Data":"85fec867537a95af74558ebc1290a36e0144a690fc55507e548dc609f10b9b5f"} Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.155205 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6rv64" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.158354 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" event={"ID":"d043494b-98ab-482a-ba53-5f2445d01bea","Type":"ContainerStarted","Data":"f7898977b65dd6291f389401ac9aebbccd3d9f353185c6d83f71cf8339576d58"} Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.158397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" event={"ID":"d043494b-98ab-482a-ba53-5f2445d01bea","Type":"ContainerStarted","Data":"aac42f015640685f3502d5e1980b9a7bc7d9c04db95097c5b1112cf800667d97"} Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.158833 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.161676 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7t9j5" event={"ID":"84707a79-5b88-454b-9e1f-5618515a5623","Type":"ContainerDied","Data":"1baed4ec62510e201e98d4ff1db781e17eaeddbf66808e48c026d445c2c837b8"} Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.161763 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7t9j5" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.167987 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.177239 4907 scope.go:117] "RemoveContainer" containerID="318d3b0311cb780240a299129789e7838c8a7b27e944f7d7a6db9e721c700ae5" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.177662 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v9qxz" event={"ID":"a8f953fc-1eac-414d-b93e-e98eaa5aea79","Type":"ContainerDied","Data":"e38331277e49c1d7e971520863e0ce387869a93b0c99d4d4e144ba8cb30a8912"} Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.177846 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v9qxz" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.188114 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dmrx4" podStartSLOduration=2.188085943 podStartE2EDuration="2.188085943s" podCreationTimestamp="2025-10-09 19:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:33:10.179861712 +0000 UTC m=+275.711829221" watchObservedRunningTime="2025-10-09 19:33:10.188085943 +0000 UTC m=+275.720053432" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.190370 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rddhr" event={"ID":"1d25de17-0079-44bd-9595-ea432cbd0982","Type":"ContainerDied","Data":"3e7ed7c3d28513e08aa51ff5d1039e91f094f051435f10a8c492945b438d6b8b"} Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.190604 4907 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rddhr" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.225323 4907 scope.go:117] "RemoveContainer" containerID="892a482aec51cacc014a29347dcab41fe770812f819db9eda511d9a578a806be" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.244586 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6rv64"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.251505 4907 scope.go:117] "RemoveContainer" containerID="752d4c754c30fbb378d359ec3690e73283c5e0aec792559042ad2d540303decf" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.252671 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6rv64"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.270585 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7t9j5"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.277898 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7t9j5"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.281691 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v9qxz"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.284331 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v9qxz"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.296596 4907 scope.go:117] "RemoveContainer" containerID="e1a1453bbf7df68b944dfea1b164d0076eb52926ebc067ac339cb1b6c1eeb6a0" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.297265 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5b4h"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.310106 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-b5b4h"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.315001 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rddhr"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.318035 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rddhr"] Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.319886 4907 scope.go:117] "RemoveContainer" containerID="7e67af814ae925fab1067532e5e0bb977f655caa3a2261f030a900e588db23d0" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.336194 4907 scope.go:117] "RemoveContainer" containerID="f710d4e674080d1e903acd33ebb519996fe9e95b89ce3e7eaeb388be86f9b24b" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.349877 4907 scope.go:117] "RemoveContainer" containerID="7b4cce3853802ee5950c2139ad0a54c386950adc2617c7403e75592e06081e52" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.362018 4907 scope.go:117] "RemoveContainer" containerID="fb58b2c39769a52698638137761aaadf646fdc3eb10b8b0d8c0183f7af484620" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.376265 4907 scope.go:117] "RemoveContainer" containerID="b8adb9a5049ea5e279f5ab4cf514228d2e839ea5efea209cc208b1a242c6bddf" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.401054 4907 scope.go:117] "RemoveContainer" containerID="2b337a9fae11ccc3b7cff8f949d4ba92b7018116607bfddc5bb95b961251d180" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.431425 4907 scope.go:117] "RemoveContainer" containerID="dc02e34de8cc5797ae7c1674131f52a5184ab61aa460d8d262c027af980d1197" Oct 09 19:33:10 crc kubenswrapper[4907]: I1009 19:33:10.446460 4907 scope.go:117] "RemoveContainer" containerID="df622bac24007560de88f1522c4e3ce7e732016908c0ffba919e2b4aef08e8f9" Oct 09 19:33:11 crc kubenswrapper[4907]: I1009 19:33:11.162630 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d25de17-0079-44bd-9595-ea432cbd0982" path="/var/lib/kubelet/pods/1d25de17-0079-44bd-9595-ea432cbd0982/volumes" Oct 09 19:33:11 crc kubenswrapper[4907]: I1009 19:33:11.163372 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84707a79-5b88-454b-9e1f-5618515a5623" path="/var/lib/kubelet/pods/84707a79-5b88-454b-9e1f-5618515a5623/volumes" Oct 09 19:33:11 crc kubenswrapper[4907]: I1009 19:33:11.164166 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" path="/var/lib/kubelet/pods/92eb9688-52c0-4ba4-8a82-3f874d85e2cf/volumes" Oct 09 19:33:11 crc kubenswrapper[4907]: I1009 19:33:11.164636 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" path="/var/lib/kubelet/pods/a8f953fc-1eac-414d-b93e-e98eaa5aea79/volumes" Oct 09 19:33:11 crc kubenswrapper[4907]: I1009 19:33:11.165239 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf99f768-d09e-4105-9150-39b510795216" path="/var/lib/kubelet/pods/bf99f768-d09e-4105-9150-39b510795216/volumes" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.493044 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b59r4"] Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.493934 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.493958 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.493980 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.493991 4907 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494009 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494021 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494038 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494052 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494067 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494078 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494095 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494105 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494122 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494133 4907 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494150 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" containerName="marketplace-operator" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494163 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" containerName="marketplace-operator" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494173 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494184 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494199 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494210 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494225 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494235 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="extract-content" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494248 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494259 4907 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: E1009 19:33:12.494276 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.494291 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="extract-utilities" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.495137 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="84707a79-5b88-454b-9e1f-5618515a5623" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.495292 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf99f768-d09e-4105-9150-39b510795216" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.495318 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="92eb9688-52c0-4ba4-8a82-3f874d85e2cf" containerName="marketplace-operator" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.495340 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d25de17-0079-44bd-9595-ea432cbd0982" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.495358 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f953fc-1eac-414d-b93e-e98eaa5aea79" containerName="registry-server" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.502559 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.502652 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b59r4"] Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.505200 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.638109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-utilities\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.638174 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2c8\" (UniqueName: \"kubernetes.io/projected/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-kube-api-access-ql2c8\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.638201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-catalog-content\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.695207 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z7tfr"] Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.696734 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.699424 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.712456 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z7tfr"] Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.739488 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-utilities\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.739571 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-catalog-content\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.739598 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2c8\" (UniqueName: \"kubernetes.io/projected/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-kube-api-access-ql2c8\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.740492 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-utilities\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " 
pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.740507 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-catalog-content\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.762939 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2c8\" (UniqueName: \"kubernetes.io/projected/50afc46f-91ad-47d4-9ef9-03be3cfa2df6-kube-api-access-ql2c8\") pod \"certified-operators-b59r4\" (UID: \"50afc46f-91ad-47d4-9ef9-03be3cfa2df6\") " pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.825879 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.841035 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0c82bc-f62f-4df0-a8d4-630a2124f553-utilities\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.841130 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0c82bc-f62f-4df0-a8d4-630a2124f553-catalog-content\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.841169 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xmbx\" (UniqueName: \"kubernetes.io/projected/ae0c82bc-f62f-4df0-a8d4-630a2124f553-kube-api-access-7xmbx\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.942968 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0c82bc-f62f-4df0-a8d4-630a2124f553-catalog-content\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.943449 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xmbx\" (UniqueName: \"kubernetes.io/projected/ae0c82bc-f62f-4df0-a8d4-630a2124f553-kube-api-access-7xmbx\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.943563 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0c82bc-f62f-4df0-a8d4-630a2124f553-utilities\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.943568 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0c82bc-f62f-4df0-a8d4-630a2124f553-catalog-content\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.944106 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0c82bc-f62f-4df0-a8d4-630a2124f553-utilities\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:12 crc kubenswrapper[4907]: I1009 19:33:12.967226 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xmbx\" (UniqueName: \"kubernetes.io/projected/ae0c82bc-f62f-4df0-a8d4-630a2124f553-kube-api-access-7xmbx\") pod \"community-operators-z7tfr\" (UID: \"ae0c82bc-f62f-4df0-a8d4-630a2124f553\") " pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:13 crc kubenswrapper[4907]: I1009 19:33:13.019452 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:13 crc kubenswrapper[4907]: I1009 19:33:13.041620 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b59r4"] Oct 09 19:33:13 crc kubenswrapper[4907]: W1009 19:33:13.061030 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50afc46f_91ad_47d4_9ef9_03be3cfa2df6.slice/crio-e851d59ac9df019579ff53ad82de3dad5573999e851f6ea2e4857e947035882f WatchSource:0}: Error finding container e851d59ac9df019579ff53ad82de3dad5573999e851f6ea2e4857e947035882f: Status 404 returned error can't find the container with id e851d59ac9df019579ff53ad82de3dad5573999e851f6ea2e4857e947035882f Oct 09 19:33:13 crc kubenswrapper[4907]: I1009 19:33:13.214800 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b59r4" event={"ID":"50afc46f-91ad-47d4-9ef9-03be3cfa2df6","Type":"ContainerStarted","Data":"e851d59ac9df019579ff53ad82de3dad5573999e851f6ea2e4857e947035882f"} Oct 09 19:33:13 crc kubenswrapper[4907]: I1009 
19:33:13.434766 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z7tfr"] Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.223841 4907 generic.go:334] "Generic (PLEG): container finished" podID="50afc46f-91ad-47d4-9ef9-03be3cfa2df6" containerID="c8c376ff58ee8ca36c6e97c18bbe458495f0010cfedb3a8b1fb949285755e272" exitCode=0 Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.223940 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b59r4" event={"ID":"50afc46f-91ad-47d4-9ef9-03be3cfa2df6","Type":"ContainerDied","Data":"c8c376ff58ee8ca36c6e97c18bbe458495f0010cfedb3a8b1fb949285755e272"} Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.225733 4907 generic.go:334] "Generic (PLEG): container finished" podID="ae0c82bc-f62f-4df0-a8d4-630a2124f553" containerID="f7c5bf7b462fc1b8107df79dc3de9b4c889a591fd16893feb5e43f6786921a30" exitCode=0 Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.225758 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7tfr" event={"ID":"ae0c82bc-f62f-4df0-a8d4-630a2124f553","Type":"ContainerDied","Data":"f7c5bf7b462fc1b8107df79dc3de9b4c889a591fd16893feb5e43f6786921a30"} Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.225774 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7tfr" event={"ID":"ae0c82bc-f62f-4df0-a8d4-630a2124f553","Type":"ContainerStarted","Data":"f6d38af701e6767be9e04b7915b1a02499b310efbd223d27a0d1b80dad497f35"} Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.890633 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zg"] Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.892238 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.895555 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.903901 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zg"] Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.971665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bztg\" (UniqueName: \"kubernetes.io/projected/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-kube-api-access-4bztg\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.971740 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-catalog-content\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:14 crc kubenswrapper[4907]: I1009 19:33:14.971786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-utilities\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.072901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bztg\" (UniqueName: \"kubernetes.io/projected/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-kube-api-access-4bztg\") pod \"redhat-marketplace-bw8zg\" (UID: 
\"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.072959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-catalog-content\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.073007 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-utilities\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.073558 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-utilities\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.073723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-catalog-content\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.095343 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5fdv4"] Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.096743 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.098978 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.104605 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdv4"] Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.116931 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bztg\" (UniqueName: \"kubernetes.io/projected/4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2-kube-api-access-4bztg\") pod \"redhat-marketplace-bw8zg\" (UID: \"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2\") " pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.173822 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd02e79-a674-49ff-895e-e691c2a42a17-catalog-content\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.173893 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd02e79-a674-49ff-895e-e691c2a42a17-utilities\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.173947 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9t5w\" (UniqueName: \"kubernetes.io/projected/ccd02e79-a674-49ff-895e-e691c2a42a17-kube-api-access-l9t5w\") pod \"redhat-operators-5fdv4\" (UID: 
\"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.225656 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.275512 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd02e79-a674-49ff-895e-e691c2a42a17-utilities\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.275604 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9t5w\" (UniqueName: \"kubernetes.io/projected/ccd02e79-a674-49ff-895e-e691c2a42a17-kube-api-access-l9t5w\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.275710 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd02e79-a674-49ff-895e-e691c2a42a17-catalog-content\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.276209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd02e79-a674-49ff-895e-e691c2a42a17-utilities\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.276404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ccd02e79-a674-49ff-895e-e691c2a42a17-catalog-content\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.294611 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9t5w\" (UniqueName: \"kubernetes.io/projected/ccd02e79-a674-49ff-895e-e691c2a42a17-kube-api-access-l9t5w\") pod \"redhat-operators-5fdv4\" (UID: \"ccd02e79-a674-49ff-895e-e691c2a42a17\") " pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:15 crc kubenswrapper[4907]: I1009 19:33:15.447131 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:16 crc kubenswrapper[4907]: I1009 19:33:16.528371 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8zg"] Oct 09 19:33:16 crc kubenswrapper[4907]: I1009 19:33:16.872878 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5fdv4"] Oct 09 19:33:16 crc kubenswrapper[4907]: W1009 19:33:16.881361 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd02e79_a674_49ff_895e_e691c2a42a17.slice/crio-6eef20778791e31d1204b92d6e9fa13ef8f09289c12ef13540e08ed860842cb5 WatchSource:0}: Error finding container 6eef20778791e31d1204b92d6e9fa13ef8f09289c12ef13540e08ed860842cb5: Status 404 returned error can't find the container with id 6eef20778791e31d1204b92d6e9fa13ef8f09289c12ef13540e08ed860842cb5 Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.246955 4907 generic.go:334] "Generic (PLEG): container finished" podID="50afc46f-91ad-47d4-9ef9-03be3cfa2df6" containerID="4547347489bb52d8ced0229ada8a9b70b599a9719662969be40252459e2838b6" exitCode=0 Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.247876 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b59r4" event={"ID":"50afc46f-91ad-47d4-9ef9-03be3cfa2df6","Type":"ContainerDied","Data":"4547347489bb52d8ced0229ada8a9b70b599a9719662969be40252459e2838b6"} Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.256910 4907 generic.go:334] "Generic (PLEG): container finished" podID="ae0c82bc-f62f-4df0-a8d4-630a2124f553" containerID="cbb85d4ad9e600d7d105499072f7629d9f85735752ad5824bb147fd4d799916e" exitCode=0 Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.256978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7tfr" event={"ID":"ae0c82bc-f62f-4df0-a8d4-630a2124f553","Type":"ContainerDied","Data":"cbb85d4ad9e600d7d105499072f7629d9f85735752ad5824bb147fd4d799916e"} Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.271998 4907 generic.go:334] "Generic (PLEG): container finished" podID="ccd02e79-a674-49ff-895e-e691c2a42a17" containerID="5ae616950fb73569cdb907e219b579af02a439cab8672548ddfc1e77f30b64b8" exitCode=0 Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.272625 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdv4" event={"ID":"ccd02e79-a674-49ff-895e-e691c2a42a17","Type":"ContainerDied","Data":"5ae616950fb73569cdb907e219b579af02a439cab8672548ddfc1e77f30b64b8"} Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.272670 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdv4" event={"ID":"ccd02e79-a674-49ff-895e-e691c2a42a17","Type":"ContainerStarted","Data":"6eef20778791e31d1204b92d6e9fa13ef8f09289c12ef13540e08ed860842cb5"} Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.278033 4907 generic.go:334] "Generic (PLEG): container finished" podID="4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2" containerID="f819d1b38df4d46138463a7f0d318861a71242ce07c3d092f370233ed26bd8db" exitCode=0 Oct 09 
19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.278074 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zg" event={"ID":"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2","Type":"ContainerDied","Data":"f819d1b38df4d46138463a7f0d318861a71242ce07c3d092f370233ed26bd8db"} Oct 09 19:33:17 crc kubenswrapper[4907]: I1009 19:33:17.278100 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zg" event={"ID":"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2","Type":"ContainerStarted","Data":"fd17497ccf83aadbd66b313e7d3113c15eb5ed93dfe96064680eb1118e229c55"} Oct 09 19:33:19 crc kubenswrapper[4907]: I1009 19:33:19.296160 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b59r4" event={"ID":"50afc46f-91ad-47d4-9ef9-03be3cfa2df6","Type":"ContainerStarted","Data":"f15af2bed9f394f8c49f4a296cb981203acdfdf7fadcd545f9d53c502530f345"} Oct 09 19:33:19 crc kubenswrapper[4907]: I1009 19:33:19.321550 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b59r4" podStartSLOduration=3.205468049 podStartE2EDuration="7.321532895s" podCreationTimestamp="2025-10-09 19:33:12 +0000 UTC" firstStartedPulling="2025-10-09 19:33:14.231760079 +0000 UTC m=+279.763727608" lastFinishedPulling="2025-10-09 19:33:18.347824925 +0000 UTC m=+283.879792454" observedRunningTime="2025-10-09 19:33:19.318149528 +0000 UTC m=+284.850117017" watchObservedRunningTime="2025-10-09 19:33:19.321532895 +0000 UTC m=+284.853500384" Oct 09 19:33:20 crc kubenswrapper[4907]: I1009 19:33:20.303296 4907 generic.go:334] "Generic (PLEG): container finished" podID="4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2" containerID="c2ac86ebca7950b3f2b784e1fc2ce3492bc4b490a9644924332d1766f076049b" exitCode=0 Oct 09 19:33:20 crc kubenswrapper[4907]: I1009 19:33:20.303551 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bw8zg" event={"ID":"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2","Type":"ContainerDied","Data":"c2ac86ebca7950b3f2b784e1fc2ce3492bc4b490a9644924332d1766f076049b"} Oct 09 19:33:20 crc kubenswrapper[4907]: I1009 19:33:20.306750 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z7tfr" event={"ID":"ae0c82bc-f62f-4df0-a8d4-630a2124f553","Type":"ContainerStarted","Data":"41e4e65347e3a428f6fc34448de67900866b675b5c26e5f6cbc84c1c5b3c6546"} Oct 09 19:33:20 crc kubenswrapper[4907]: I1009 19:33:20.312085 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdv4" event={"ID":"ccd02e79-a674-49ff-895e-e691c2a42a17","Type":"ContainerStarted","Data":"5d33590c0dec1c4c142cebab426a86589e0331973b9a19f4528392b2d2f08cad"} Oct 09 19:33:20 crc kubenswrapper[4907]: I1009 19:33:20.343682 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z7tfr" podStartSLOduration=3.7561173180000003 podStartE2EDuration="8.343663903s" podCreationTimestamp="2025-10-09 19:33:12 +0000 UTC" firstStartedPulling="2025-10-09 19:33:14.229350322 +0000 UTC m=+279.761317811" lastFinishedPulling="2025-10-09 19:33:18.816896897 +0000 UTC m=+284.348864396" observedRunningTime="2025-10-09 19:33:20.340277776 +0000 UTC m=+285.872245295" watchObservedRunningTime="2025-10-09 19:33:20.343663903 +0000 UTC m=+285.875631392" Oct 09 19:33:21 crc kubenswrapper[4907]: I1009 19:33:21.319453 4907 generic.go:334] "Generic (PLEG): container finished" podID="ccd02e79-a674-49ff-895e-e691c2a42a17" containerID="5d33590c0dec1c4c142cebab426a86589e0331973b9a19f4528392b2d2f08cad" exitCode=0 Oct 09 19:33:21 crc kubenswrapper[4907]: I1009 19:33:21.319604 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdv4" 
event={"ID":"ccd02e79-a674-49ff-895e-e691c2a42a17","Type":"ContainerDied","Data":"5d33590c0dec1c4c142cebab426a86589e0331973b9a19f4528392b2d2f08cad"} Oct 09 19:33:22 crc kubenswrapper[4907]: I1009 19:33:22.328164 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5fdv4" event={"ID":"ccd02e79-a674-49ff-895e-e691c2a42a17","Type":"ContainerStarted","Data":"3c6a26853a7377cf65842ee96ec74de72e6df71c5debfceffe5b37555982c85b"} Oct 09 19:33:22 crc kubenswrapper[4907]: I1009 19:33:22.330816 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8zg" event={"ID":"4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2","Type":"ContainerStarted","Data":"a10663b0398b2724ca93a62fa7875e2797f7f3b3936bb60884a0d97075c190d9"} Oct 09 19:33:22 crc kubenswrapper[4907]: I1009 19:33:22.350373 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5fdv4" podStartSLOduration=2.621510527 podStartE2EDuration="7.350350098s" podCreationTimestamp="2025-10-09 19:33:15 +0000 UTC" firstStartedPulling="2025-10-09 19:33:17.273783296 +0000 UTC m=+282.805750785" lastFinishedPulling="2025-10-09 19:33:22.002622877 +0000 UTC m=+287.534590356" observedRunningTime="2025-10-09 19:33:22.349911026 +0000 UTC m=+287.881878535" watchObservedRunningTime="2025-10-09 19:33:22.350350098 +0000 UTC m=+287.882317607" Oct 09 19:33:22 crc kubenswrapper[4907]: I1009 19:33:22.367890 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bw8zg" podStartSLOduration=4.320491483 podStartE2EDuration="8.367869961s" podCreationTimestamp="2025-10-09 19:33:14 +0000 UTC" firstStartedPulling="2025-10-09 19:33:17.279945968 +0000 UTC m=+282.811913457" lastFinishedPulling="2025-10-09 19:33:21.327324446 +0000 UTC m=+286.859291935" observedRunningTime="2025-10-09 19:33:22.365171471 +0000 UTC m=+287.897138980" 
watchObservedRunningTime="2025-10-09 19:33:22.367869961 +0000 UTC m=+287.899837450" Oct 09 19:33:22 crc kubenswrapper[4907]: I1009 19:33:22.826219 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:22 crc kubenswrapper[4907]: I1009 19:33:22.826276 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:22 crc kubenswrapper[4907]: I1009 19:33:22.871498 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:23 crc kubenswrapper[4907]: I1009 19:33:23.020153 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:23 crc kubenswrapper[4907]: I1009 19:33:23.020234 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:23 crc kubenswrapper[4907]: I1009 19:33:23.062926 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:25 crc kubenswrapper[4907]: I1009 19:33:25.226756 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:25 crc kubenswrapper[4907]: I1009 19:33:25.228634 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:25 crc kubenswrapper[4907]: I1009 19:33:25.286257 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:25 crc kubenswrapper[4907]: I1009 19:33:25.448404 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:25 
crc kubenswrapper[4907]: I1009 19:33:25.448494 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:26 crc kubenswrapper[4907]: I1009 19:33:26.420583 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bw8zg" Oct 09 19:33:26 crc kubenswrapper[4907]: I1009 19:33:26.498569 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5fdv4" podUID="ccd02e79-a674-49ff-895e-e691c2a42a17" containerName="registry-server" probeResult="failure" output=< Oct 09 19:33:26 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 19:33:26 crc kubenswrapper[4907]: > Oct 09 19:33:32 crc kubenswrapper[4907]: I1009 19:33:32.877459 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b59r4" Oct 09 19:33:33 crc kubenswrapper[4907]: I1009 19:33:33.061021 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z7tfr" Oct 09 19:33:35 crc kubenswrapper[4907]: I1009 19:33:35.498200 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:33:35 crc kubenswrapper[4907]: I1009 19:33:35.541570 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5fdv4" Oct 09 19:34:06 crc kubenswrapper[4907]: I1009 19:34:06.299510 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:34:06 crc kubenswrapper[4907]: I1009 19:34:06.300206 4907 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:34:36 crc kubenswrapper[4907]: I1009 19:34:36.299587 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:34:36 crc kubenswrapper[4907]: I1009 19:34:36.301991 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.300104 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.301095 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.301167 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.302175 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b36d2556ee836a20add9ae68257132453204781c07c26a6515296da113e1362"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.302276 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://6b36d2556ee836a20add9ae68257132453204781c07c26a6515296da113e1362" gracePeriod=600 Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.995800 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="6b36d2556ee836a20add9ae68257132453204781c07c26a6515296da113e1362" exitCode=0 Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.995902 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"6b36d2556ee836a20add9ae68257132453204781c07c26a6515296da113e1362"} Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.996338 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"20040c38bb94f5f105635ae1ef3d872533313f11c565c46967e83a901a8d6060"} Oct 09 19:35:06 crc kubenswrapper[4907]: I1009 19:35:06.996373 4907 scope.go:117] "RemoveContainer" containerID="796b4498348e78e11c8dda4ae58c397dee04d60335891243436efe172e5e0b61" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.694606 
4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-62knw"] Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.697032 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.721589 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-62knw"] Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.835020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f94e929-a1a2-4d07-ab99-110d2d17910e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.835705 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4dh\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-kube-api-access-kf4dh\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.835830 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f94e929-a1a2-4d07-ab99-110d2d17910e-trusted-ca\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.836035 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-registry-tls\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.836169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f94e929-a1a2-4d07-ab99-110d2d17910e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.836293 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-bound-sa-token\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.836417 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.836569 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f94e929-a1a2-4d07-ab99-110d2d17910e-registry-certificates\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.865848 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.938582 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f94e929-a1a2-4d07-ab99-110d2d17910e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.938796 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-bound-sa-token\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.938915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f94e929-a1a2-4d07-ab99-110d2d17910e-registry-certificates\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.939085 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/7f94e929-a1a2-4d07-ab99-110d2d17910e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.939195 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4dh\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-kube-api-access-kf4dh\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.939292 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f94e929-a1a2-4d07-ab99-110d2d17910e-trusted-ca\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.939389 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-registry-tls\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.939884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f94e929-a1a2-4d07-ab99-110d2d17910e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.941650 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f94e929-a1a2-4d07-ab99-110d2d17910e-registry-certificates\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.941938 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f94e929-a1a2-4d07-ab99-110d2d17910e-trusted-ca\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.947420 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f94e929-a1a2-4d07-ab99-110d2d17910e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.947948 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-registry-tls\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.959273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4dh\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-kube-api-access-kf4dh\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:34 crc kubenswrapper[4907]: I1009 19:36:34.962845 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f94e929-a1a2-4d07-ab99-110d2d17910e-bound-sa-token\") pod \"image-registry-66df7c8f76-62knw\" (UID: \"7f94e929-a1a2-4d07-ab99-110d2d17910e\") " pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:35 crc kubenswrapper[4907]: I1009 19:36:35.015817 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:35 crc kubenswrapper[4907]: I1009 19:36:35.234978 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-62knw"] Oct 09 19:36:35 crc kubenswrapper[4907]: W1009 19:36:35.243556 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f94e929_a1a2_4d07_ab99_110d2d17910e.slice/crio-9080b1164a3b969cef7590aa4e5f781d063e7370a01699867ea986e164b73306 WatchSource:0}: Error finding container 9080b1164a3b969cef7590aa4e5f781d063e7370a01699867ea986e164b73306: Status 404 returned error can't find the container with id 9080b1164a3b969cef7590aa4e5f781d063e7370a01699867ea986e164b73306 Oct 09 19:36:35 crc kubenswrapper[4907]: I1009 19:36:35.621851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-62knw" event={"ID":"7f94e929-a1a2-4d07-ab99-110d2d17910e","Type":"ContainerStarted","Data":"ce36b700a5b9c4e5b1ea10e727832395d8699e34d53782e1229faea262d901d2"} Oct 09 19:36:35 crc kubenswrapper[4907]: I1009 19:36:35.621986 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-62knw" event={"ID":"7f94e929-a1a2-4d07-ab99-110d2d17910e","Type":"ContainerStarted","Data":"9080b1164a3b969cef7590aa4e5f781d063e7370a01699867ea986e164b73306"} Oct 09 19:36:35 crc kubenswrapper[4907]: I1009 19:36:35.622072 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:35 crc kubenswrapper[4907]: I1009 19:36:35.647403 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-62knw" podStartSLOduration=1.647374965 podStartE2EDuration="1.647374965s" podCreationTimestamp="2025-10-09 19:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:36:35.646071991 +0000 UTC m=+481.178039540" watchObservedRunningTime="2025-10-09 19:36:35.647374965 +0000 UTC m=+481.179342494" Oct 09 19:36:55 crc kubenswrapper[4907]: I1009 19:36:55.027557 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-62knw" Oct 09 19:36:55 crc kubenswrapper[4907]: I1009 19:36:55.106023 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mdh4s"] Oct 09 19:37:06 crc kubenswrapper[4907]: I1009 19:37:06.299399 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:37:06 crc kubenswrapper[4907]: I1009 19:37:06.300281 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.153917 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" podUID="2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" containerName="registry" containerID="cri-o://560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87" gracePeriod=30 Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.588636 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.620889 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-certificates\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.620968 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8mjk\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-kube-api-access-w8mjk\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.621034 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-ca-trust-extracted\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.621118 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-trusted-ca\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.621154 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-tls\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.622736 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.622907 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.623613 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.624204 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-bound-sa-token\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.624244 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-installation-pull-secrets\") pod \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\" (UID: \"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e\") " Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.624806 4907 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.624831 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.629338 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.629637 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.633979 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-kube-api-access-w8mjk" (OuterVolumeSpecName: "kube-api-access-w8mjk") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "kube-api-access-w8mjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.634085 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.635661 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.663439 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" (UID: "2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.726067 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8mjk\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-kube-api-access-w8mjk\") on node \"crc\" DevicePath \"\"" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.726131 4907 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.726156 4907 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.726176 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.726195 4907 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.935604 4907 generic.go:334] "Generic (PLEG): container 
finished" podID="2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" containerID="560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87" exitCode=0 Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.935670 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" event={"ID":"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e","Type":"ContainerDied","Data":"560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87"} Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.935705 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.935731 4907 scope.go:117] "RemoveContainer" containerID="560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.935712 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mdh4s" event={"ID":"2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e","Type":"ContainerDied","Data":"17bfddbb1c569a9aa47780758d22f9b650cf6e409eed4767c0bbc1d4f89face4"} Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.970975 4907 scope.go:117] "RemoveContainer" containerID="560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87" Oct 09 19:37:20 crc kubenswrapper[4907]: E1009 19:37:20.972054 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87\": container with ID starting with 560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87 not found: ID does not exist" containerID="560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.972132 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87"} err="failed to get container status \"560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87\": rpc error: code = NotFound desc = could not find container \"560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87\": container with ID starting with 560c07b0c93e625155bd5e3e23ed75171979ab38a2ad1bdd1ae1f4637dfe7a87 not found: ID does not exist" Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.990264 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mdh4s"] Oct 09 19:37:20 crc kubenswrapper[4907]: I1009 19:37:20.998169 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mdh4s"] Oct 09 19:37:21 crc kubenswrapper[4907]: I1009 19:37:21.163683 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" path="/var/lib/kubelet/pods/2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e/volumes" Oct 09 19:37:36 crc kubenswrapper[4907]: I1009 19:37:36.299770 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:37:36 crc kubenswrapper[4907]: I1009 19:37:36.300178 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:38:06 crc kubenswrapper[4907]: I1009 19:38:06.299454 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:38:06 crc kubenswrapper[4907]: I1009 19:38:06.299974 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:38:06 crc kubenswrapper[4907]: I1009 19:38:06.300018 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:38:06 crc kubenswrapper[4907]: I1009 19:38:06.300392 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20040c38bb94f5f105635ae1ef3d872533313f11c565c46967e83a901a8d6060"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:38:06 crc kubenswrapper[4907]: I1009 19:38:06.300440 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://20040c38bb94f5f105635ae1ef3d872533313f11c565c46967e83a901a8d6060" gracePeriod=600 Oct 09 19:38:07 crc kubenswrapper[4907]: I1009 19:38:07.264411 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="20040c38bb94f5f105635ae1ef3d872533313f11c565c46967e83a901a8d6060" exitCode=0 Oct 09 19:38:07 crc kubenswrapper[4907]: I1009 19:38:07.264492 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"20040c38bb94f5f105635ae1ef3d872533313f11c565c46967e83a901a8d6060"} Oct 09 19:38:07 crc kubenswrapper[4907]: I1009 19:38:07.265397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"9652a7dfb693b946f43ed7007125b8bc1aa6768f8074819278bd9dc415f2d69d"} Oct 09 19:38:07 crc kubenswrapper[4907]: I1009 19:38:07.265448 4907 scope.go:117] "RemoveContainer" containerID="6b36d2556ee836a20add9ae68257132453204781c07c26a6515296da113e1362" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.784100 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr"] Oct 09 19:38:36 crc kubenswrapper[4907]: E1009 19:38:36.784814 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" containerName="registry" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.784827 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" containerName="registry" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.784921 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ec8b0ec-5ec8-4c8a-97f0-64107dcb229e" containerName="registry" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.785647 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.790058 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.800989 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr"] Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.923891 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.923951 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmqpv\" (UniqueName: \"kubernetes.io/projected/76767f10-290e-430e-890f-cd5e6769c46e-kube-api-access-fmqpv\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:36 crc kubenswrapper[4907]: I1009 19:38:36.923973 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: 
I1009 19:38:37.024844 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.024920 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqpv\" (UniqueName: \"kubernetes.io/projected/76767f10-290e-430e-890f-cd5e6769c46e-kube-api-access-fmqpv\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.024949 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.025508 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.025723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.045752 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmqpv\" (UniqueName: \"kubernetes.io/projected/76767f10-290e-430e-890f-cd5e6769c46e-kube-api-access-fmqpv\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.144871 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.376336 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr"] Oct 09 19:38:37 crc kubenswrapper[4907]: I1009 19:38:37.484802 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" event={"ID":"76767f10-290e-430e-890f-cd5e6769c46e","Type":"ContainerStarted","Data":"eebf8497acbd0ebf4620d46cd8cc028f889a36234ec948134cdf57c5baa506a8"} Oct 09 19:38:38 crc kubenswrapper[4907]: I1009 19:38:38.493366 4907 generic.go:334] "Generic (PLEG): container finished" podID="76767f10-290e-430e-890f-cd5e6769c46e" containerID="45512c0eb0eba7e0fd2e066a8cd2929e6d412cf385d6588a1eda2745f2234824" exitCode=0 Oct 09 19:38:38 crc kubenswrapper[4907]: I1009 19:38:38.493734 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" event={"ID":"76767f10-290e-430e-890f-cd5e6769c46e","Type":"ContainerDied","Data":"45512c0eb0eba7e0fd2e066a8cd2929e6d412cf385d6588a1eda2745f2234824"} Oct 09 19:38:38 crc kubenswrapper[4907]: I1009 19:38:38.496969 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 19:38:41 crc kubenswrapper[4907]: I1009 19:38:41.523218 4907 generic.go:334] "Generic (PLEG): container finished" podID="76767f10-290e-430e-890f-cd5e6769c46e" containerID="090716ca6e2f829abcdac4c0f31ee3ef72c992be4d9737b573b1e1753a182079" exitCode=0 Oct 09 19:38:41 crc kubenswrapper[4907]: I1009 19:38:41.523347 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" event={"ID":"76767f10-290e-430e-890f-cd5e6769c46e","Type":"ContainerDied","Data":"090716ca6e2f829abcdac4c0f31ee3ef72c992be4d9737b573b1e1753a182079"} Oct 09 19:38:42 crc kubenswrapper[4907]: I1009 19:38:42.535134 4907 generic.go:334] "Generic (PLEG): container finished" podID="76767f10-290e-430e-890f-cd5e6769c46e" containerID="e642167c008126fc9386f20e61725f00650ec3ccb05e820d4505b7764582fdef" exitCode=0 Oct 09 19:38:42 crc kubenswrapper[4907]: I1009 19:38:42.535190 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" event={"ID":"76767f10-290e-430e-890f-cd5e6769c46e","Type":"ContainerDied","Data":"e642167c008126fc9386f20e61725f00650ec3ccb05e820d4505b7764582fdef"} Oct 09 19:38:43 crc kubenswrapper[4907]: I1009 19:38:43.848831 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:43 crc kubenswrapper[4907]: I1009 19:38:43.926380 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-util\") pod \"76767f10-290e-430e-890f-cd5e6769c46e\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " Oct 09 19:38:43 crc kubenswrapper[4907]: I1009 19:38:43.926428 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-bundle\") pod \"76767f10-290e-430e-890f-cd5e6769c46e\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " Oct 09 19:38:43 crc kubenswrapper[4907]: I1009 19:38:43.926498 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmqpv\" (UniqueName: \"kubernetes.io/projected/76767f10-290e-430e-890f-cd5e6769c46e-kube-api-access-fmqpv\") pod \"76767f10-290e-430e-890f-cd5e6769c46e\" (UID: \"76767f10-290e-430e-890f-cd5e6769c46e\") " Oct 09 19:38:43 crc kubenswrapper[4907]: I1009 19:38:43.929651 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-bundle" (OuterVolumeSpecName: "bundle") pod "76767f10-290e-430e-890f-cd5e6769c46e" (UID: "76767f10-290e-430e-890f-cd5e6769c46e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:38:43 crc kubenswrapper[4907]: I1009 19:38:43.935087 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76767f10-290e-430e-890f-cd5e6769c46e-kube-api-access-fmqpv" (OuterVolumeSpecName: "kube-api-access-fmqpv") pod "76767f10-290e-430e-890f-cd5e6769c46e" (UID: "76767f10-290e-430e-890f-cd5e6769c46e"). InnerVolumeSpecName "kube-api-access-fmqpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:38:43 crc kubenswrapper[4907]: I1009 19:38:43.943013 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-util" (OuterVolumeSpecName: "util") pod "76767f10-290e-430e-890f-cd5e6769c46e" (UID: "76767f10-290e-430e-890f-cd5e6769c46e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:38:44 crc kubenswrapper[4907]: I1009 19:38:44.029764 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:44 crc kubenswrapper[4907]: I1009 19:38:44.029824 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmqpv\" (UniqueName: \"kubernetes.io/projected/76767f10-290e-430e-890f-cd5e6769c46e-kube-api-access-fmqpv\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:44 crc kubenswrapper[4907]: I1009 19:38:44.029848 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76767f10-290e-430e-890f-cd5e6769c46e-util\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:44 crc kubenswrapper[4907]: I1009 19:38:44.551310 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" event={"ID":"76767f10-290e-430e-890f-cd5e6769c46e","Type":"ContainerDied","Data":"eebf8497acbd0ebf4620d46cd8cc028f889a36234ec948134cdf57c5baa506a8"} Oct 09 19:38:44 crc kubenswrapper[4907]: I1009 19:38:44.551347 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr" Oct 09 19:38:44 crc kubenswrapper[4907]: I1009 19:38:44.551359 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eebf8497acbd0ebf4620d46cd8cc028f889a36234ec948134cdf57c5baa506a8" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.046431 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t8m7t"] Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.047372 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-controller" containerID="cri-o://96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.047495 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="nbdb" containerID="cri-o://f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.047545 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="northd" containerID="cri-o://7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.047574 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-acl-logging" containerID="cri-o://91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: 
I1009 19:38:48.047553 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kube-rbac-proxy-node" containerID="cri-o://a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.047687 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="sbdb" containerID="cri-o://2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.047508 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.135230 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" containerID="cri-o://2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec" gracePeriod=30 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.421330 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/3.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.423430 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovn-acl-logging/0.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.423851 4907 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovn-controller/0.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.424182 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504600 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6rp7n"] Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504838 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kube-rbac-proxy-node" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504857 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kube-rbac-proxy-node" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504872 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504880 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504892 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504900 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504910 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76767f10-290e-430e-890f-cd5e6769c46e" containerName="pull" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504918 4907 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="76767f10-290e-430e-890f-cd5e6769c46e" containerName="pull" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504930 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kubecfg-setup" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504939 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kubecfg-setup" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504947 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504954 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504962 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76767f10-290e-430e-890f-cd5e6769c46e" containerName="util" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.504984 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="76767f10-290e-430e-890f-cd5e6769c46e" containerName="util" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.504998 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76767f10-290e-430e-890f-cd5e6769c46e" containerName="extract" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505007 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="76767f10-290e-430e-890f-cd5e6769c46e" containerName="extract" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505018 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="northd" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505025 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="northd" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505036 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="sbdb" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505044 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="sbdb" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505055 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="nbdb" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505062 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="nbdb" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505072 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-acl-logging" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505079 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-acl-logging" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505089 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505097 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505108 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505119 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" 
containerName="kube-rbac-proxy-ovn-metrics" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505133 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505141 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505256 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-acl-logging" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505280 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505290 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovn-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505301 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="sbdb" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505311 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="nbdb" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505320 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505328 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505337 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kube-rbac-proxy-node" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505347 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505356 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="76767f10-290e-430e-890f-cd5e6769c46e" containerName="extract" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505364 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="northd" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505374 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.505503 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505513 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.505611 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerName="ovnkube-controller" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.507438 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519653 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-node-log\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519701 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-openvswitch\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519744 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-etc-openvswitch\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-node-log" (OuterVolumeSpecName: "node-log") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519797 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-log-socket\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519803 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519823 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519844 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-var-lib-openvswitch\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519864 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-log-socket" (OuterVolumeSpecName: "log-socket") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). 
InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519884 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8n28\" (UniqueName: \"kubernetes.io/projected/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-kube-api-access-p8n28\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519913 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-netns\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519956 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-ovn\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519944 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519986 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.519987 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-bin\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520056 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-ovn-kubernetes\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520101 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-config\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520055 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520031 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520135 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-systemd\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520164 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-var-lib-cni-networks-ovn-kubernetes\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520100 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520238 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-netd\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520295 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-env-overrides\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520319 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520341 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-script-lib\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520370 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-systemd-units\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520370 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520399 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-slash\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520431 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovn-node-metrics-cert\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520457 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-kubelet\") pod \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\" (UID: \"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631\") " Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520673 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520761 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520783 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-slash" (OuterVolumeSpecName: "host-slash") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.520954 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521125 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521513 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521533 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521547 4907 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521557 4907 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-slash\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521567 4907 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521577 4907 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-node-log\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521587 4907 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521597 4907 
reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521606 4907 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-log-socket\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521616 4907 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521629 4907 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521640 4907 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521651 4907 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521661 4907 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521672 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521682 4907 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.521693 4907 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.525307 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-kube-api-access-p8n28" (OuterVolumeSpecName: "kube-api-access-p8n28") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "kube-api-access-p8n28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.528297 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.531778 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" (UID: "85e063f4-3eb6-4502-bf2a-b7e8b0dd7631"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.575383 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovnkube-controller/3.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.597480 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovn-acl-logging/0.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598265 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t8m7t_85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/ovn-controller/0.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598752 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec" exitCode=0 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598803 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702" exitCode=0 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598816 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d" exitCode=0 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598825 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb" exitCode=0 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598834 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" 
containerID="eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745" exitCode=0 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598860 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567" exitCode=0 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598870 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9" exitCode=143 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598880 4907 generic.go:334] "Generic (PLEG): container finished" podID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" containerID="96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f" exitCode=143 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.598957 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599013 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599031 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599062 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" 
event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599079 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599095 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599119 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599136 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599144 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599152 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599160 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599168 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599176 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599184 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599191 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599201 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599214 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599223 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599231 4907 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599282 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599293 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599301 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599309 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599320 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599329 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599337 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599348 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599361 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599370 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599378 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599387 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599395 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599403 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599414 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} Oct 09 
19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599422 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599429 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599437 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599447 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" event={"ID":"85e063f4-3eb6-4502-bf2a-b7e8b0dd7631","Type":"ContainerDied","Data":"d762407dbdb1d5ce09cb35464d11a7924759dbf47c23a6e647dc36bdc9569405"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599525 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599540 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599548 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599556 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599564 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599571 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599589 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599598 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599608 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599621 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.599646 4907 scope.go:117] "RemoveContainer" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.601846 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t8m7t" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.623971 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-kubelet\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624016 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovnkube-config\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624038 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-node-log\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624057 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovnkube-script-lib\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624079 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-var-lib-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: 
\"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624097 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-systemd-units\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624113 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-cni-bin\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624127 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-run-netns\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624143 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624161 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-ovn\") pod \"ovnkube-node-6rp7n\" (UID: 
\"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624189 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624212 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-systemd\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-etc-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624251 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624267 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovn-node-metrics-cert\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624289 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-env-overrides\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624309 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-slash\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-cni-netd\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-log-socket\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624356 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z84gx\" 
(UniqueName: \"kubernetes.io/projected/9719295f-5529-4df0-84a3-7ffdc5dd9be1-kube-api-access-z84gx\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624386 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624397 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8n28\" (UniqueName: \"kubernetes.io/projected/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-kube-api-access-p8n28\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.624406 4907 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.631963 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/2.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.639863 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/1.log" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.639921 4907 generic.go:334] "Generic (PLEG): container finished" podID="64344fcc-f9f2-424f-a32b-44927641b614" containerID="9a676382e3b8fb157627fb4d13edff66f1e877f4c38457dd35387965f237f3df" exitCode=2 Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.639958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" 
event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerDied","Data":"9a676382e3b8fb157627fb4d13edff66f1e877f4c38457dd35387965f237f3df"}
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.639983 4907 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d"}
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.640409 4907 scope.go:117] "RemoveContainer" containerID="9a676382e3b8fb157627fb4d13edff66f1e877f4c38457dd35387965f237f3df"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.640592 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hns2h_openshift-multus(64344fcc-f9f2-424f-a32b-44927641b614)\"" pod="openshift-multus/multus-hns2h" podUID="64344fcc-f9f2-424f-a32b-44927641b614"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.648006 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.676740 4907 scope.go:117] "RemoveContainer" containerID="2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.692191 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t8m7t"]
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.703930 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t8m7t"]
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.705304 4907 scope.go:117] "RemoveContainer" containerID="f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.718873 4907 scope.go:117] "RemoveContainer" containerID="7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.725867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-slash\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.725935 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-cni-netd\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.725960 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-log-socket\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.725976 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z84gx\" (UniqueName: \"kubernetes.io/projected/9719295f-5529-4df0-84a3-7ffdc5dd9be1-kube-api-access-z84gx\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.725996 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-kubelet\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovnkube-config\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726032 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-node-log\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726057 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovnkube-script-lib\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-var-lib-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-systemd-units\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726119 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-cni-bin\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726134 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-run-netns\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726149 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726168 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-ovn\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726193 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-systemd\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726232 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-etc-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726256 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovn-node-metrics-cert\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-env-overrides\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726779 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-kubelet\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726929 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-cni-netd\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726947 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-env-overrides\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726975 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-log-socket\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.726992 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-var-lib-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727012 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-slash\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727025 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-systemd-units\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727044 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727057 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-systemd\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727078 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-cni-bin\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727089 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-etc-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727111 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-run-netns\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727125 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727146 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-openvswitch\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727181 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-run-ovn\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9719295f-5529-4df0-84a3-7ffdc5dd9be1-node-log\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.727872 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovnkube-config\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.728114 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovnkube-script-lib\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.732961 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9719295f-5529-4df0-84a3-7ffdc5dd9be1-ovn-node-metrics-cert\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.738031 4907 scope.go:117] "RemoveContainer" containerID="eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.744198 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z84gx\" (UniqueName: \"kubernetes.io/projected/9719295f-5529-4df0-84a3-7ffdc5dd9be1-kube-api-access-z84gx\") pod \"ovnkube-node-6rp7n\" (UID: \"9719295f-5529-4df0-84a3-7ffdc5dd9be1\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.757005 4907 scope.go:117] "RemoveContainer" containerID="a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.780156 4907 scope.go:117] "RemoveContainer" containerID="91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.795363 4907 scope.go:117] "RemoveContainer" containerID="96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.818703 4907 scope.go:117] "RemoveContainer" containerID="978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.820080 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.846177 4907 scope.go:117] "RemoveContainer" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.846838 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": container with ID starting with 2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec not found: ID does not exist" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.846972 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} err="failed to get container status \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": rpc error: code = NotFound desc = could not find container \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": container with ID starting with 2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.847082 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.850739 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": container with ID starting with e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a not found: ID does not exist" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.850982 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} err="failed to get container status \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": rpc error: code = NotFound desc = could not find container \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": container with ID starting with e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.851093 4907 scope.go:117] "RemoveContainer" containerID="2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.855189 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": container with ID starting with 2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702 not found: ID does not exist" containerID="2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.855256 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} err="failed to get container status \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": rpc error: code = NotFound desc = could not find container \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": container with ID starting with 2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.855288 4907 scope.go:117] "RemoveContainer" containerID="f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.855652 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": container with ID starting with f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d not found: ID does not exist" containerID="f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.855761 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} err="failed to get container status \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": rpc error: code = NotFound desc = could not find container \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": container with ID starting with f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.855870 4907 scope.go:117] "RemoveContainer" containerID="7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.857332 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": container with ID starting with 7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb not found: ID does not exist" containerID="7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.857361 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} err="failed to get container status \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": rpc error: code = NotFound desc = could not find container \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": container with ID starting with 7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.857381 4907 scope.go:117] "RemoveContainer" containerID="eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.858568 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": container with ID starting with eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745 not found: ID does not exist" containerID="eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.858610 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} err="failed to get container status \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": rpc error: code = NotFound desc = could not find container \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": container with ID starting with eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.858637 4907 scope.go:117] "RemoveContainer" containerID="a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.859950 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": container with ID starting with a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567 not found: ID does not exist" containerID="a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.859975 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} err="failed to get container status \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": rpc error: code = NotFound desc = could not find container \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": container with ID starting with a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.859990 4907 scope.go:117] "RemoveContainer" containerID="91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.860234 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": container with ID starting with 91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9 not found: ID does not exist" containerID="91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.860253 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} err="failed to get container status \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": rpc error: code = NotFound desc = could not find container \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": container with ID starting with 91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.860266 4907 scope.go:117] "RemoveContainer" containerID="96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.860589 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": container with ID starting with 96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f not found: ID does not exist" containerID="96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.860612 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} err="failed to get container status \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": rpc error: code = NotFound desc = could not find container \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": container with ID starting with 96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.860628 4907 scope.go:117] "RemoveContainer" containerID="978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"
Oct 09 19:38:48 crc kubenswrapper[4907]: E1009 19:38:48.860924 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": container with ID starting with 978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9 not found: ID does not exist" containerID="978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.860984 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} err="failed to get container status \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": rpc error: code = NotFound desc = could not find container \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": container with ID starting with 978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.861029 4907 scope.go:117] "RemoveContainer" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.861452 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} err="failed to get container status \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": rpc error: code = NotFound desc = could not find container \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": container with ID starting with 2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.861491 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.861965 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} err="failed to get container status \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": rpc error: code = NotFound desc = could not find container \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": container with ID starting with e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.861985 4907 scope.go:117] "RemoveContainer" containerID="2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.862228 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} err="failed to get container status \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": rpc error: code = NotFound desc = could not find container \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": container with ID starting with 2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.862257 4907 scope.go:117] "RemoveContainer" containerID="f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.863423 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} err="failed to get container status \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": rpc error: code = NotFound desc = could not find container \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": container with ID starting with f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.863448 4907 scope.go:117] "RemoveContainer" containerID="7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.864288 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} err="failed to get container status \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": rpc error: code = NotFound desc = could not find container \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": container with ID starting with 7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.864312 4907 scope.go:117] "RemoveContainer" containerID="eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.864503 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} err="failed to get container status \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": rpc error: code = NotFound desc = could not find container \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": container with ID starting with eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.864522 4907 scope.go:117] "RemoveContainer" containerID="a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.864812 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} err="failed to get container status \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": rpc error: code = NotFound desc = could not find container \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": container with ID starting with a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.864832 4907 scope.go:117] "RemoveContainer" containerID="91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865074 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} err="failed to get container status \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": rpc error: code = NotFound desc = could not find container \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": container with ID starting with 91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865093 4907 scope.go:117] "RemoveContainer" containerID="96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865335 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} err="failed to get container status \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": rpc error: code = NotFound desc = could not find container \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": container with ID starting with 96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865354 4907 scope.go:117] "RemoveContainer" containerID="978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865620 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} err="failed to get container status \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": rpc error: code = NotFound desc = could not find container \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": container with ID starting with 978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865640 4907 scope.go:117] "RemoveContainer" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865892 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} err="failed to get container status \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": rpc error: code = NotFound desc = could not find container \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": container with ID starting with 2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.865910 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.866163 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} err="failed to get container status \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": rpc error: code = NotFound desc = could not find container \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": container with ID starting with e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.866182 4907 scope.go:117] "RemoveContainer" containerID="2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.866478 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} err="failed to get container status \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": rpc error: code = NotFound desc = could not find container \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": container with ID starting with 2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702 not found: ID does not exist"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.866500 4907 scope.go:117] "RemoveContainer" containerID="f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"
Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.866771 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} err="failed to get container status \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": rpc error: code = NotFound desc = could not find container \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": container with ID starting with f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d not found: ID does not
exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.866790 4907 scope.go:117] "RemoveContainer" containerID="7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.867067 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} err="failed to get container status \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": rpc error: code = NotFound desc = could not find container \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": container with ID starting with 7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.867087 4907 scope.go:117] "RemoveContainer" containerID="eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.867354 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} err="failed to get container status \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": rpc error: code = NotFound desc = could not find container \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": container with ID starting with eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.867373 4907 scope.go:117] "RemoveContainer" containerID="a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.867731 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} err="failed to get container status 
\"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": rpc error: code = NotFound desc = could not find container \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": container with ID starting with a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.867751 4907 scope.go:117] "RemoveContainer" containerID="91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.873539 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} err="failed to get container status \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": rpc error: code = NotFound desc = could not find container \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": container with ID starting with 91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.873562 4907 scope.go:117] "RemoveContainer" containerID="96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.874792 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} err="failed to get container status \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": rpc error: code = NotFound desc = could not find container \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": container with ID starting with 96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.874846 4907 scope.go:117] "RemoveContainer" 
containerID="978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.875255 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} err="failed to get container status \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": rpc error: code = NotFound desc = could not find container \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": container with ID starting with 978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.875272 4907 scope.go:117] "RemoveContainer" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.877853 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} err="failed to get container status \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": rpc error: code = NotFound desc = could not find container \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": container with ID starting with 2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.877903 4907 scope.go:117] "RemoveContainer" containerID="e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.878229 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a"} err="failed to get container status \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": rpc error: code = NotFound desc = could 
not find container \"e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a\": container with ID starting with e426e873bb83150acd6dd2bcc26b272895541121df8e81d78f33a967380a7e7a not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.878273 4907 scope.go:117] "RemoveContainer" containerID="2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.878556 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702"} err="failed to get container status \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": rpc error: code = NotFound desc = could not find container \"2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702\": container with ID starting with 2ddc2731009f823b58aae20671537cc2a7588c59147f1cc9b323f4ac56bbc702 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.878577 4907 scope.go:117] "RemoveContainer" containerID="f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.879109 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d"} err="failed to get container status \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": rpc error: code = NotFound desc = could not find container \"f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d\": container with ID starting with f435083f59c3082850b4289a02fb782af5d7a3b15f1964701e0373e7e889146d not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.879128 4907 scope.go:117] "RemoveContainer" containerID="7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 
19:38:48.879739 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb"} err="failed to get container status \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": rpc error: code = NotFound desc = could not find container \"7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb\": container with ID starting with 7f9925cc1a639656c7f646c14cd8258afaa060c1fab1ef22794eedd170d5d1fb not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.879787 4907 scope.go:117] "RemoveContainer" containerID="eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.880130 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745"} err="failed to get container status \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": rpc error: code = NotFound desc = could not find container \"eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745\": container with ID starting with eac99142f79aeb4fd7384ee3fee18121366a17929e9ee12b41e87d3e49223745 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.880157 4907 scope.go:117] "RemoveContainer" containerID="a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.880927 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567"} err="failed to get container status \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": rpc error: code = NotFound desc = could not find container \"a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567\": container with ID starting with 
a81826207f71c6e1989e28222262cf9665feb037e658b23bcbbdcee1d1590567 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.880949 4907 scope.go:117] "RemoveContainer" containerID="91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.881574 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9"} err="failed to get container status \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": rpc error: code = NotFound desc = could not find container \"91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9\": container with ID starting with 91f5700a42afb2edc035e33b849469cd910c77afeaac04b1952d593906d12ac9 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.881601 4907 scope.go:117] "RemoveContainer" containerID="96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.881849 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f"} err="failed to get container status \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": rpc error: code = NotFound desc = could not find container \"96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f\": container with ID starting with 96025df1b83688ebcddf3ae1be44568a8ffbf98c35c0d7f06760ca1b33d7480f not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.881867 4907 scope.go:117] "RemoveContainer" containerID="978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.882060 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9"} err="failed to get container status \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": rpc error: code = NotFound desc = could not find container \"978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9\": container with ID starting with 978601d56b285b76dc57f440a0b819d1994393f870cdf24da327f268132eb2c9 not found: ID does not exist" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.882081 4907 scope.go:117] "RemoveContainer" containerID="2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec" Oct 09 19:38:48 crc kubenswrapper[4907]: I1009 19:38:48.882895 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec"} err="failed to get container status \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": rpc error: code = NotFound desc = could not find container \"2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec\": container with ID starting with 2e38a12b084cc5ef6b7257f233878f45f4e26caea29e445fc1897b643039e6ec not found: ID does not exist" Oct 09 19:38:49 crc kubenswrapper[4907]: I1009 19:38:49.158308 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e063f4-3eb6-4502-bf2a-b7e8b0dd7631" path="/var/lib/kubelet/pods/85e063f4-3eb6-4502-bf2a-b7e8b0dd7631/volumes" Oct 09 19:38:49 crc kubenswrapper[4907]: I1009 19:38:49.646094 4907 generic.go:334] "Generic (PLEG): container finished" podID="9719295f-5529-4df0-84a3-7ffdc5dd9be1" containerID="71f257e40f8dd9f21112afd1bebce3176299e221e8cba82af45f571f5ce0064b" exitCode=0 Oct 09 19:38:49 crc kubenswrapper[4907]: I1009 19:38:49.646155 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" 
event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerDied","Data":"71f257e40f8dd9f21112afd1bebce3176299e221e8cba82af45f571f5ce0064b"} Oct 09 19:38:49 crc kubenswrapper[4907]: I1009 19:38:49.646183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"ef6391a9faa2bca881c66b62303d46f7feea9dce1058101d2c198dca80f0d9d0"} Oct 09 19:38:50 crc kubenswrapper[4907]: I1009 19:38:50.697671 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"912c8c89b7cf7944e9799df966a1e7eb8efbe9fc6cb7412989cecb6a2b7a64ce"} Oct 09 19:38:50 crc kubenswrapper[4907]: I1009 19:38:50.698643 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"023fbc0885ffcd9f09ac527e411fc85cf3b3786534b51d3d08e635ab78f18e88"} Oct 09 19:38:50 crc kubenswrapper[4907]: I1009 19:38:50.698658 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"c848c68f51c943fe42f8ffb0489a04f0ce39b0f2339136b98202ac5939d009dc"} Oct 09 19:38:50 crc kubenswrapper[4907]: I1009 19:38:50.698669 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"af58deb929cf4bd35a1ba35d3084e20b8bacc132a0bc4a14cabe7ce1361149d9"} Oct 09 19:38:50 crc kubenswrapper[4907]: I1009 19:38:50.698680 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" 
event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"487bf965e9613a496626cc10855b20c1a1cd9ec510087269e28e34f6dc00e4c1"} Oct 09 19:38:50 crc kubenswrapper[4907]: I1009 19:38:50.698690 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"880ce601981fdbc65c46c4fb5027dfd3a21928f553e360e4b8f7b5b4dbe11587"} Oct 09 19:38:52 crc kubenswrapper[4907]: I1009 19:38:52.713816 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"b11e9d71c396c3ddb22691777a35317e3ec68112559e9b1f4e203b9e1bb22284"} Oct 09 19:38:54 crc kubenswrapper[4907]: I1009 19:38:54.925610 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8"] Oct 09 19:38:54 crc kubenswrapper[4907]: I1009 19:38:54.926649 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:54 crc kubenswrapper[4907]: I1009 19:38:54.928662 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 09 19:38:54 crc kubenswrapper[4907]: I1009 19:38:54.929163 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 09 19:38:54 crc kubenswrapper[4907]: I1009 19:38:54.929999 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-hhpn2" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.005384 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7znld\" (UniqueName: \"kubernetes.io/projected/df906780-9fa6-4336-8b74-dd4061587bfe-kube-api-access-7znld\") pod \"obo-prometheus-operator-7c8cf85677-vfrf8\" (UID: \"df906780-9fa6-4336-8b74-dd4061587bfe\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.048201 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.048830 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.050886 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.051441 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-p7tkv" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.057973 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.058733 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.106091 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7znld\" (UniqueName: \"kubernetes.io/projected/df906780-9fa6-4336-8b74-dd4061587bfe-kube-api-access-7znld\") pod \"obo-prometheus-operator-7c8cf85677-vfrf8\" (UID: \"df906780-9fa6-4336-8b74-dd4061587bfe\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.127958 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7znld\" (UniqueName: \"kubernetes.io/projected/df906780-9fa6-4336-8b74-dd4061587bfe-kube-api-access-7znld\") pod \"obo-prometheus-operator-7c8cf85677-vfrf8\" (UID: \"df906780-9fa6-4336-8b74-dd4061587bfe\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.207325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6b76317-83ef-4bab-b4bd-7940ca0c954e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd\" (UID: \"d6b76317-83ef-4bab-b4bd-7940ca0c954e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.207375 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1d14ebb-33ce-4f94-b224-f267661a1704-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz\" (UID: \"d1d14ebb-33ce-4f94-b224-f267661a1704\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.207394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1d14ebb-33ce-4f94-b224-f267661a1704-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz\" (UID: \"d1d14ebb-33ce-4f94-b224-f267661a1704\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.207432 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6b76317-83ef-4bab-b4bd-7940ca0c954e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd\" (UID: \"d6b76317-83ef-4bab-b4bd-7940ca0c954e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.246284 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.270402 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-qp2p6"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.271233 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.274094 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-vljtw" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.274793 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.292805 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(2e78917c1a38b471840450bdc6a903343a4c7e4afb553a31974b66c4d4978849): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.292864 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(2e78917c1a38b471840450bdc6a903343a4c7e4afb553a31974b66c4d4978849): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.292885 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(2e78917c1a38b471840450bdc6a903343a4c7e4afb553a31974b66c4d4978849): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.292921 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(2e78917c1a38b471840450bdc6a903343a4c7e4afb553a31974b66c4d4978849): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" podUID="df906780-9fa6-4336-8b74-dd4061587bfe" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.308360 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6b76317-83ef-4bab-b4bd-7940ca0c954e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd\" (UID: \"d6b76317-83ef-4bab-b4bd-7940ca0c954e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.308407 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1d14ebb-33ce-4f94-b224-f267661a1704-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz\" (UID: \"d1d14ebb-33ce-4f94-b224-f267661a1704\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.308436 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1d14ebb-33ce-4f94-b224-f267661a1704-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz\" (UID: \"d1d14ebb-33ce-4f94-b224-f267661a1704\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.308492 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6b76317-83ef-4bab-b4bd-7940ca0c954e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd\" (UID: \"d6b76317-83ef-4bab-b4bd-7940ca0c954e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc 
kubenswrapper[4907]: I1009 19:38:55.312069 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6b76317-83ef-4bab-b4bd-7940ca0c954e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd\" (UID: \"d6b76317-83ef-4bab-b4bd-7940ca0c954e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.312079 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1d14ebb-33ce-4f94-b224-f267661a1704-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz\" (UID: \"d1d14ebb-33ce-4f94-b224-f267661a1704\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.312598 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1d14ebb-33ce-4f94-b224-f267661a1704-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz\" (UID: \"d1d14ebb-33ce-4f94-b224-f267661a1704\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.314002 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6b76317-83ef-4bab-b4bd-7940ca0c954e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd\" (UID: \"d6b76317-83ef-4bab-b4bd-7940ca0c954e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.362854 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.375218 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.382743 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(a62c20b2b4a8728f9ff2c5c43d0aa8a6e7a6049d1f12bf303b6ae900ad02ebdc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.382808 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(a62c20b2b4a8728f9ff2c5c43d0aa8a6e7a6049d1f12bf303b6ae900ad02ebdc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.382836 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(a62c20b2b4a8728f9ff2c5c43d0aa8a6e7a6049d1f12bf303b6ae900ad02ebdc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.382911 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators(d6b76317-83ef-4bab-b4bd-7940ca0c954e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators(d6b76317-83ef-4bab-b4bd-7940ca0c954e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(a62c20b2b4a8728f9ff2c5c43d0aa8a6e7a6049d1f12bf303b6ae900ad02ebdc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" podUID="d6b76317-83ef-4bab-b4bd-7940ca0c954e" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.398430 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(9cf421e75019e10ba056d406ed3a352bfbd2a0dff859073a74a8ae9be7b781c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.398507 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(9cf421e75019e10ba056d406ed3a352bfbd2a0dff859073a74a8ae9be7b781c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.398527 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(9cf421e75019e10ba056d406ed3a352bfbd2a0dff859073a74a8ae9be7b781c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.398568 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators(d1d14ebb-33ce-4f94-b224-f267661a1704)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators(d1d14ebb-33ce-4f94-b224-f267661a1704)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(9cf421e75019e10ba056d406ed3a352bfbd2a0dff859073a74a8ae9be7b781c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" podUID="d1d14ebb-33ce-4f94-b224-f267661a1704" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.409913 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f351fc6-080d-41c9-ab41-44dc032b6579-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-qp2p6\" (UID: \"8f351fc6-080d-41c9-ab41-44dc032b6579\") " pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.410024 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwhfb\" (UniqueName: \"kubernetes.io/projected/8f351fc6-080d-41c9-ab41-44dc032b6579-kube-api-access-fwhfb\") pod \"observability-operator-cc5f78dfc-qp2p6\" (UID: \"8f351fc6-080d-41c9-ab41-44dc032b6579\") " pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.442138 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-csbcg"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.443888 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.447892 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-kpnf6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.513055 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f351fc6-080d-41c9-ab41-44dc032b6579-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-qp2p6\" (UID: \"8f351fc6-080d-41c9-ab41-44dc032b6579\") " pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.513149 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwhfb\" (UniqueName: \"kubernetes.io/projected/8f351fc6-080d-41c9-ab41-44dc032b6579-kube-api-access-fwhfb\") pod \"observability-operator-cc5f78dfc-qp2p6\" (UID: \"8f351fc6-080d-41c9-ab41-44dc032b6579\") " pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.518162 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f351fc6-080d-41c9-ab41-44dc032b6579-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-qp2p6\" (UID: \"8f351fc6-080d-41c9-ab41-44dc032b6579\") " pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.532269 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwhfb\" (UniqueName: \"kubernetes.io/projected/8f351fc6-080d-41c9-ab41-44dc032b6579-kube-api-access-fwhfb\") pod \"observability-operator-cc5f78dfc-qp2p6\" (UID: \"8f351fc6-080d-41c9-ab41-44dc032b6579\") " pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" 
Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.604726 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.614775 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5sl\" (UniqueName: \"kubernetes.io/projected/2bb9bd82-399c-43cb-aad5-37832f57ba4f-kube-api-access-qc5sl\") pod \"perses-operator-54bc95c9fb-csbcg\" (UID: \"2bb9bd82-399c-43cb-aad5-37832f57ba4f\") " pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.614837 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2bb9bd82-399c-43cb-aad5-37832f57ba4f-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-csbcg\" (UID: \"2bb9bd82-399c-43cb-aad5-37832f57ba4f\") " pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.633437 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(c2d91c7f6543dc7856a9bdbfb3f830ed57a98622b9db762ba6d6d31d8ee467ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.633519 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(c2d91c7f6543dc7856a9bdbfb3f830ed57a98622b9db762ba6d6d31d8ee467ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.633539 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(c2d91c7f6543dc7856a9bdbfb3f830ed57a98622b9db762ba6d6d31d8ee467ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.633588 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(c2d91c7f6543dc7856a9bdbfb3f830ed57a98622b9db762ba6d6d31d8ee467ca): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" podUID="8f351fc6-080d-41c9-ab41-44dc032b6579" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.716521 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2bb9bd82-399c-43cb-aad5-37832f57ba4f-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-csbcg\" (UID: \"2bb9bd82-399c-43cb-aad5-37832f57ba4f\") " pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.716727 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5sl\" (UniqueName: \"kubernetes.io/projected/2bb9bd82-399c-43cb-aad5-37832f57ba4f-kube-api-access-qc5sl\") pod \"perses-operator-54bc95c9fb-csbcg\" (UID: \"2bb9bd82-399c-43cb-aad5-37832f57ba4f\") " pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.717549 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2bb9bd82-399c-43cb-aad5-37832f57ba4f-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-csbcg\" (UID: \"2bb9bd82-399c-43cb-aad5-37832f57ba4f\") " pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.741582 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" event={"ID":"9719295f-5529-4df0-84a3-7ffdc5dd9be1","Type":"ContainerStarted","Data":"461e09c459a80fc85e40814ad7e6887857b8c9daadc7633d400ebbca91db6d88"} Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.741958 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.742028 4907 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.761290 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5sl\" (UniqueName: \"kubernetes.io/projected/2bb9bd82-399c-43cb-aad5-37832f57ba4f-kube-api-access-qc5sl\") pod \"perses-operator-54bc95c9fb-csbcg\" (UID: \"2bb9bd82-399c-43cb-aad5-37832f57ba4f\") " pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.776450 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.782483 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.793818 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" podStartSLOduration=7.793796222 podStartE2EDuration="7.793796222s" podCreationTimestamp="2025-10-09 19:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:38:55.784907244 +0000 UTC m=+621.316874753" watchObservedRunningTime="2025-10-09 19:38:55.793796222 +0000 UTC m=+621.325763721" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.805154 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(b84837803770a01ee9c51f72f56d585f90f049e7fbf6bedf38ea553aeaeadaf2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.805230 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(b84837803770a01ee9c51f72f56d585f90f049e7fbf6bedf38ea553aeaeadaf2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.805253 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(b84837803770a01ee9c51f72f56d585f90f049e7fbf6bedf38ea553aeaeadaf2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.805299 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-csbcg_openshift-operators(2bb9bd82-399c-43cb-aad5-37832f57ba4f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-csbcg_openshift-operators(2bb9bd82-399c-43cb-aad5-37832f57ba4f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(b84837803770a01ee9c51f72f56d585f90f049e7fbf6bedf38ea553aeaeadaf2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" podUID="2bb9bd82-399c-43cb-aad5-37832f57ba4f" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.855558 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-qp2p6"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.855672 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.856137 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.858453 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-csbcg"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.868806 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.868930 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.869372 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.878589 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(ff27ae36318ef0e6cde568e670622ce2440960757d271a5b666f045bca058e67): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.878648 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(ff27ae36318ef0e6cde568e670622ce2440960757d271a5b666f045bca058e67): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.878670 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(ff27ae36318ef0e6cde568e670622ce2440960757d271a5b666f045bca058e67): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.878717 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(ff27ae36318ef0e6cde568e670622ce2440960757d271a5b666f045bca058e67): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" podUID="8f351fc6-080d-41c9-ab41-44dc032b6579" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.892295 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.892577 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.893062 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.904221 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(441b27cffee460e9caf972ab004e65f4c6beb3f6ffd5c4c8bf099514119c8ea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.904305 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(441b27cffee460e9caf972ab004e65f4c6beb3f6ffd5c4c8bf099514119c8ea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.904335 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(441b27cffee460e9caf972ab004e65f4c6beb3f6ffd5c4c8bf099514119c8ea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.904401 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators(d6b76317-83ef-4bab-b4bd-7940ca0c954e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators(d6b76317-83ef-4bab-b4bd-7940ca0c954e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(441b27cffee460e9caf972ab004e65f4c6beb3f6ffd5c4c8bf099514119c8ea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" podUID="d6b76317-83ef-4bab-b4bd-7940ca0c954e" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.916978 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(8eaa5b57de26b9f20ff738c5c80422cd93696139ce7899c2adf40d0f45548c4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.917124 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(8eaa5b57de26b9f20ff738c5c80422cd93696139ce7899c2adf40d0f45548c4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.917209 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(8eaa5b57de26b9f20ff738c5c80422cd93696139ce7899c2adf40d0f45548c4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.917329 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators(d1d14ebb-33ce-4f94-b224-f267661a1704)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators(d1d14ebb-33ce-4f94-b224-f267661a1704)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(8eaa5b57de26b9f20ff738c5c80422cd93696139ce7899c2adf40d0f45548c4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" podUID="d1d14ebb-33ce-4f94-b224-f267661a1704" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.926477 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8"] Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.926586 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: I1009 19:38:55.926970 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.948777 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(b21ef153e629c9fc8ffe4d5b325d43d57408e7176b5e4bd0754e21109925c4e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.948855 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(b21ef153e629c9fc8ffe4d5b325d43d57408e7176b5e4bd0754e21109925c4e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.948879 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(b21ef153e629c9fc8ffe4d5b325d43d57408e7176b5e4bd0754e21109925c4e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:38:55 crc kubenswrapper[4907]: E1009 19:38:55.948924 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(b21ef153e629c9fc8ffe4d5b325d43d57408e7176b5e4bd0754e21109925c4e7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" podUID="df906780-9fa6-4336-8b74-dd4061587bfe" Oct 09 19:38:56 crc kubenswrapper[4907]: I1009 19:38:56.747596 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:38:56 crc kubenswrapper[4907]: I1009 19:38:56.747614 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:56 crc kubenswrapper[4907]: I1009 19:38:56.748583 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:56 crc kubenswrapper[4907]: E1009 19:38:56.828952 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(289297833b4d77e0d6d9e45ff18a39cfca2ad3a014f194d93400e4cf347a4457): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 09 19:38:56 crc kubenswrapper[4907]: E1009 19:38:56.829118 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(289297833b4d77e0d6d9e45ff18a39cfca2ad3a014f194d93400e4cf347a4457): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:56 crc kubenswrapper[4907]: E1009 19:38:56.829211 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(289297833b4d77e0d6d9e45ff18a39cfca2ad3a014f194d93400e4cf347a4457): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:38:56 crc kubenswrapper[4907]: E1009 19:38:56.829344 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-csbcg_openshift-operators(2bb9bd82-399c-43cb-aad5-37832f57ba4f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-csbcg_openshift-operators(2bb9bd82-399c-43cb-aad5-37832f57ba4f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(289297833b4d77e0d6d9e45ff18a39cfca2ad3a014f194d93400e4cf347a4457): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" podUID="2bb9bd82-399c-43cb-aad5-37832f57ba4f" Oct 09 19:38:56 crc kubenswrapper[4907]: I1009 19:38:56.841313 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:39:03 crc kubenswrapper[4907]: I1009 19:39:03.151611 4907 scope.go:117] "RemoveContainer" containerID="9a676382e3b8fb157627fb4d13edff66f1e877f4c38457dd35387965f237f3df" Oct 09 19:39:03 crc kubenswrapper[4907]: E1009 19:39:03.152608 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hns2h_openshift-multus(64344fcc-f9f2-424f-a32b-44927641b614)\"" pod="openshift-multus/multus-hns2h" podUID="64344fcc-f9f2-424f-a32b-44927641b614" Oct 09 19:39:07 crc kubenswrapper[4907]: I1009 19:39:07.150800 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:39:07 crc kubenswrapper[4907]: I1009 19:39:07.151539 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:39:07 crc kubenswrapper[4907]: E1009 19:39:07.204096 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(d9c10f67eefaf20102437401c0747eae1fdc2c7b3f9cf55c6fa122512278d95f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 09 19:39:07 crc kubenswrapper[4907]: E1009 19:39:07.204162 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(d9c10f67eefaf20102437401c0747eae1fdc2c7b3f9cf55c6fa122512278d95f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:39:07 crc kubenswrapper[4907]: E1009 19:39:07.204184 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(d9c10f67eefaf20102437401c0747eae1fdc2c7b3f9cf55c6fa122512278d95f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:39:07 crc kubenswrapper[4907]: E1009 19:39:07.204226 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators(d1d14ebb-33ce-4f94-b224-f267661a1704)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators(d1d14ebb-33ce-4f94-b224-f267661a1704)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_openshift-operators_d1d14ebb-33ce-4f94-b224-f267661a1704_0(d9c10f67eefaf20102437401c0747eae1fdc2c7b3f9cf55c6fa122512278d95f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" podUID="d1d14ebb-33ce-4f94-b224-f267661a1704" Oct 09 19:39:08 crc kubenswrapper[4907]: I1009 19:39:08.150888 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:08 crc kubenswrapper[4907]: I1009 19:39:08.150927 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:39:08 crc kubenswrapper[4907]: I1009 19:39:08.151718 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:08 crc kubenswrapper[4907]: I1009 19:39:08.151860 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.199654 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(e4383042ed4cddc8f0c85c5b5a44c47e6157ca58b7e9090f8fbb8a866b5b1b5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.199739 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(e4383042ed4cddc8f0c85c5b5a44c47e6157ca58b7e9090f8fbb8a866b5b1b5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.199767 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(e4383042ed4cddc8f0c85c5b5a44c47e6157ca58b7e9090f8fbb8a866b5b1b5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.199817 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(e4383042ed4cddc8f0c85c5b5a44c47e6157ca58b7e9090f8fbb8a866b5b1b5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" podUID="8f351fc6-080d-41c9-ab41-44dc032b6579" Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.202419 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(bd8945bbe2d002d88f2d159a430d026efbdebc3beb0451ce1a8c6a2b33ec7b2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.202492 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(bd8945bbe2d002d88f2d159a430d026efbdebc3beb0451ce1a8c6a2b33ec7b2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.202518 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(bd8945bbe2d002d88f2d159a430d026efbdebc3beb0451ce1a8c6a2b33ec7b2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:39:08 crc kubenswrapper[4907]: E1009 19:39:08.202564 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators(d6b76317-83ef-4bab-b4bd-7940ca0c954e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators(d6b76317-83ef-4bab-b4bd-7940ca0c954e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_openshift-operators_d6b76317-83ef-4bab-b4bd-7940ca0c954e_0(bd8945bbe2d002d88f2d159a430d026efbdebc3beb0451ce1a8c6a2b33ec7b2d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" podUID="d6b76317-83ef-4bab-b4bd-7940ca0c954e" Oct 09 19:39:09 crc kubenswrapper[4907]: I1009 19:39:09.151196 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:09 crc kubenswrapper[4907]: I1009 19:39:09.151676 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:09 crc kubenswrapper[4907]: E1009 19:39:09.200199 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(32b91a6582040791e8a5e2f904c235500b782a9597a8e916b518885b7758261f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:39:09 crc kubenswrapper[4907]: E1009 19:39:09.200342 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(32b91a6582040791e8a5e2f904c235500b782a9597a8e916b518885b7758261f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:09 crc kubenswrapper[4907]: E1009 19:39:09.200374 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(32b91a6582040791e8a5e2f904c235500b782a9597a8e916b518885b7758261f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:09 crc kubenswrapper[4907]: E1009 19:39:09.200428 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(32b91a6582040791e8a5e2f904c235500b782a9597a8e916b518885b7758261f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" podUID="df906780-9fa6-4336-8b74-dd4061587bfe" Oct 09 19:39:12 crc kubenswrapper[4907]: I1009 19:39:12.150902 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:12 crc kubenswrapper[4907]: I1009 19:39:12.152018 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:12 crc kubenswrapper[4907]: E1009 19:39:12.182620 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(6dcc0df12b1200c7ef8d8e928b771996f1087f59841baa4b058f2e1443a3945e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:39:12 crc kubenswrapper[4907]: E1009 19:39:12.182691 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(6dcc0df12b1200c7ef8d8e928b771996f1087f59841baa4b058f2e1443a3945e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:12 crc kubenswrapper[4907]: E1009 19:39:12.182719 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(6dcc0df12b1200c7ef8d8e928b771996f1087f59841baa4b058f2e1443a3945e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:12 crc kubenswrapper[4907]: E1009 19:39:12.182771 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54bc95c9fb-csbcg_openshift-operators(2bb9bd82-399c-43cb-aad5-37832f57ba4f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54bc95c9fb-csbcg_openshift-operators(2bb9bd82-399c-43cb-aad5-37832f57ba4f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54bc95c9fb-csbcg_openshift-operators_2bb9bd82-399c-43cb-aad5-37832f57ba4f_0(6dcc0df12b1200c7ef8d8e928b771996f1087f59841baa4b058f2e1443a3945e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" podUID="2bb9bd82-399c-43cb-aad5-37832f57ba4f" Oct 09 19:39:18 crc kubenswrapper[4907]: I1009 19:39:18.847953 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rp7n" Oct 09 19:39:19 crc kubenswrapper[4907]: I1009 19:39:19.151336 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:19 crc kubenswrapper[4907]: I1009 19:39:19.151764 4907 scope.go:117] "RemoveContainer" containerID="9a676382e3b8fb157627fb4d13edff66f1e877f4c38457dd35387965f237f3df" Oct 09 19:39:19 crc kubenswrapper[4907]: I1009 19:39:19.151821 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:19 crc kubenswrapper[4907]: E1009 19:39:19.201748 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(9943aa0bb913850b7b7253d9582d4c941dc953269c2d0ae527dd9dbbd6184223): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:39:19 crc kubenswrapper[4907]: E1009 19:39:19.201838 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(9943aa0bb913850b7b7253d9582d4c941dc953269c2d0ae527dd9dbbd6184223): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:19 crc kubenswrapper[4907]: E1009 19:39:19.201868 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(9943aa0bb913850b7b7253d9582d4c941dc953269c2d0ae527dd9dbbd6184223): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:19 crc kubenswrapper[4907]: E1009 19:39:19.201927 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-cc5f78dfc-qp2p6_openshift-operators(8f351fc6-080d-41c9-ab41-44dc032b6579)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-cc5f78dfc-qp2p6_openshift-operators_8f351fc6-080d-41c9-ab41-44dc032b6579_0(9943aa0bb913850b7b7253d9582d4c941dc953269c2d0ae527dd9dbbd6184223): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" podUID="8f351fc6-080d-41c9-ab41-44dc032b6579" Oct 09 19:39:19 crc kubenswrapper[4907]: I1009 19:39:19.867698 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/2.log" Oct 09 19:39:19 crc kubenswrapper[4907]: I1009 19:39:19.868826 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/1.log" Oct 09 19:39:19 crc kubenswrapper[4907]: I1009 19:39:19.868898 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hns2h" event={"ID":"64344fcc-f9f2-424f-a32b-44927641b614","Type":"ContainerStarted","Data":"e188cefc59d3b7834169717e90b9de8e2f93ca2876de78517f81b9ed9e6586ae"} Oct 09 19:39:20 crc kubenswrapper[4907]: I1009 19:39:20.150806 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:20 crc kubenswrapper[4907]: I1009 19:39:20.151777 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:20 crc kubenswrapper[4907]: E1009 19:39:20.175680 4907 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(1c5920cf6209695c0316986450a95b52d4317866c4190c34db87536d4a34cc70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 19:39:20 crc kubenswrapper[4907]: E1009 19:39:20.175747 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(1c5920cf6209695c0316986450a95b52d4317866c4190c34db87536d4a34cc70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:20 crc kubenswrapper[4907]: E1009 19:39:20.175771 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(1c5920cf6209695c0316986450a95b52d4317866c4190c34db87536d4a34cc70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:20 crc kubenswrapper[4907]: E1009 19:39:20.175828 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators(df906780-9fa6-4336-8b74-dd4061587bfe)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-7c8cf85677-vfrf8_openshift-operators_df906780-9fa6-4336-8b74-dd4061587bfe_0(1c5920cf6209695c0316986450a95b52d4317866c4190c34db87536d4a34cc70): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" podUID="df906780-9fa6-4336-8b74-dd4061587bfe" Oct 09 19:39:22 crc kubenswrapper[4907]: I1009 19:39:22.150894 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:39:22 crc kubenswrapper[4907]: I1009 19:39:22.152162 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" Oct 09 19:39:22 crc kubenswrapper[4907]: I1009 19:39:22.631031 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz"] Oct 09 19:39:22 crc kubenswrapper[4907]: I1009 19:39:22.887804 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" event={"ID":"d1d14ebb-33ce-4f94-b224-f267661a1704","Type":"ContainerStarted","Data":"815877fd4884190ce2252c690de1a7c6c386941319b44cd6e826506650cc37dc"} Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.150745 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.150770 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.151358 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.151430 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.459891 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd"] Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.526292 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-csbcg"] Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.913067 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" event={"ID":"d6b76317-83ef-4bab-b4bd-7940ca0c954e","Type":"ContainerStarted","Data":"b38bb0a7e7ad76a758f032313d51395faca295b06dd0da284c0a086c9af51827"} Oct 09 19:39:23 crc kubenswrapper[4907]: I1009 19:39:23.914517 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" event={"ID":"2bb9bd82-399c-43cb-aad5-37832f57ba4f","Type":"ContainerStarted","Data":"327b8da4e675f12dfa4ff96811ddb442c9a7aa984feefe7bb9292fc1ced7515f"} Oct 09 19:39:29 crc kubenswrapper[4907]: I1009 19:39:29.950685 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" event={"ID":"d6b76317-83ef-4bab-b4bd-7940ca0c954e","Type":"ContainerStarted","Data":"365cd6ed1bcc2de1acb7ded05980ddea7cc5b07808f2dba309c0d7b526464e0f"} Oct 09 19:39:29 crc kubenswrapper[4907]: I1009 19:39:29.953815 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" event={"ID":"d1d14ebb-33ce-4f94-b224-f267661a1704","Type":"ContainerStarted","Data":"ec39edf95a2390fd31a7b5feed2dc50c1fc59f0d2c87ca178de80e42b5267ffc"} Oct 09 19:39:29 crc kubenswrapper[4907]: I1009 19:39:29.956109 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" event={"ID":"2bb9bd82-399c-43cb-aad5-37832f57ba4f","Type":"ContainerStarted","Data":"3c2fc7d542b397f62e0b3d595106bea32866bcccd85ebd84d173237d8fd1a353"} Oct 09 19:39:29 crc kubenswrapper[4907]: I1009 19:39:29.956296 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:29 crc kubenswrapper[4907]: I1009 19:39:29.983931 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd" podStartSLOduration=29.156702482 podStartE2EDuration="34.983906626s" podCreationTimestamp="2025-10-09 19:38:55 +0000 UTC" firstStartedPulling="2025-10-09 19:39:23.47669566 +0000 UTC m=+649.008663149" lastFinishedPulling="2025-10-09 19:39:29.303899804 +0000 UTC m=+654.835867293" observedRunningTime="2025-10-09 19:39:29.981005191 +0000 UTC m=+655.512972690" watchObservedRunningTime="2025-10-09 19:39:29.983906626 +0000 UTC m=+655.515874145" Oct 09 19:39:30 crc kubenswrapper[4907]: I1009 19:39:30.008836 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz" podStartSLOduration=28.370760022 podStartE2EDuration="35.008816404s" podCreationTimestamp="2025-10-09 19:38:55 +0000 UTC" firstStartedPulling="2025-10-09 19:39:22.64428585 +0000 UTC m=+648.176253339" lastFinishedPulling="2025-10-09 19:39:29.282342222 +0000 UTC m=+654.814309721" observedRunningTime="2025-10-09 19:39:30.004793601 +0000 UTC m=+655.536761100" watchObservedRunningTime="2025-10-09 19:39:30.008816404 +0000 UTC m=+655.540783913" Oct 09 19:39:30 crc kubenswrapper[4907]: I1009 19:39:30.038520 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" podStartSLOduration=29.238787396 podStartE2EDuration="35.038492505s" 
podCreationTimestamp="2025-10-09 19:38:55 +0000 UTC" firstStartedPulling="2025-10-09 19:39:23.532164042 +0000 UTC m=+649.064131541" lastFinishedPulling="2025-10-09 19:39:29.331869161 +0000 UTC m=+654.863836650" observedRunningTime="2025-10-09 19:39:30.032324287 +0000 UTC m=+655.564291806" watchObservedRunningTime="2025-10-09 19:39:30.038492505 +0000 UTC m=+655.570460004" Oct 09 19:39:33 crc kubenswrapper[4907]: I1009 19:39:33.151313 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:33 crc kubenswrapper[4907]: I1009 19:39:33.152240 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" Oct 09 19:39:33 crc kubenswrapper[4907]: I1009 19:39:33.369153 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8"] Oct 09 19:39:33 crc kubenswrapper[4907]: W1009 19:39:33.381515 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf906780_9fa6_4336_8b74_dd4061587bfe.slice/crio-184cb6e5eda95705f2a42fa8bea82cc62ab91dfc9e444c2bb3d1e9f760d89de2 WatchSource:0}: Error finding container 184cb6e5eda95705f2a42fa8bea82cc62ab91dfc9e444c2bb3d1e9f760d89de2: Status 404 returned error can't find the container with id 184cb6e5eda95705f2a42fa8bea82cc62ab91dfc9e444c2bb3d1e9f760d89de2 Oct 09 19:39:33 crc kubenswrapper[4907]: I1009 19:39:33.983045 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" event={"ID":"df906780-9fa6-4336-8b74-dd4061587bfe","Type":"ContainerStarted","Data":"184cb6e5eda95705f2a42fa8bea82cc62ab91dfc9e444c2bb3d1e9f760d89de2"} Oct 09 19:39:34 crc kubenswrapper[4907]: I1009 19:39:34.150762 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:34 crc kubenswrapper[4907]: I1009 19:39:34.151802 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:34 crc kubenswrapper[4907]: I1009 19:39:34.434036 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-qp2p6"] Oct 09 19:39:34 crc kubenswrapper[4907]: I1009 19:39:34.989279 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" event={"ID":"8f351fc6-080d-41c9-ab41-44dc032b6579","Type":"ContainerStarted","Data":"11805f381e88b38fce5a76237c48753f76ea4da3c68aeac6e6544846f35420d0"} Oct 09 19:39:35 crc kubenswrapper[4907]: I1009 19:39:35.367244 4907 scope.go:117] "RemoveContainer" containerID="40f1e98828509239898d27515268749bcc89081dd001d0fafdc18d3013407d0d" Oct 09 19:39:35 crc kubenswrapper[4907]: I1009 19:39:35.798962 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-csbcg" Oct 09 19:39:35 crc kubenswrapper[4907]: I1009 19:39:35.997262 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hns2h_64344fcc-f9f2-424f-a32b-44927641b614/kube-multus/2.log" Oct 09 19:39:38 crc kubenswrapper[4907]: I1009 19:39:38.011144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" event={"ID":"df906780-9fa6-4336-8b74-dd4061587bfe","Type":"ContainerStarted","Data":"1a02aefaeb8ded45e8e9d76ee49e99fa1f8204c8b35a17839b478b3f63d9cb5a"} Oct 09 19:39:38 crc kubenswrapper[4907]: I1009 19:39:38.037078 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vfrf8" podStartSLOduration=41.711509037 
podStartE2EDuration="44.037058424s" podCreationTimestamp="2025-10-09 19:38:54 +0000 UTC" firstStartedPulling="2025-10-09 19:39:33.383590499 +0000 UTC m=+658.915557998" lastFinishedPulling="2025-10-09 19:39:35.709139896 +0000 UTC m=+661.241107385" observedRunningTime="2025-10-09 19:39:38.033688227 +0000 UTC m=+663.565655736" watchObservedRunningTime="2025-10-09 19:39:38.037058424 +0000 UTC m=+663.569025933" Oct 09 19:39:40 crc kubenswrapper[4907]: I1009 19:39:40.029672 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" event={"ID":"8f351fc6-080d-41c9-ab41-44dc032b6579","Type":"ContainerStarted","Data":"81854b307cd84add25c6f9769e102826cd35e5cd78228109654b766642cf39ea"} Oct 09 19:39:40 crc kubenswrapper[4907]: I1009 19:39:40.030053 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:40 crc kubenswrapper[4907]: I1009 19:39:40.063297 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" podStartSLOduration=40.388511386 podStartE2EDuration="45.063253987s" podCreationTimestamp="2025-10-09 19:38:55 +0000 UTC" firstStartedPulling="2025-10-09 19:39:34.444617489 +0000 UTC m=+659.976584988" lastFinishedPulling="2025-10-09 19:39:39.1193601 +0000 UTC m=+664.651327589" observedRunningTime="2025-10-09 19:39:40.062067687 +0000 UTC m=+665.594035236" watchObservedRunningTime="2025-10-09 19:39:40.063253987 +0000 UTC m=+665.595221496" Oct 09 19:39:40 crc kubenswrapper[4907]: I1009 19:39:40.086343 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-qp2p6" Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.967072 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-jxzvd"] Oct 09 19:39:49 crc 
kubenswrapper[4907]: I1009 19:39:49.968655 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.971360 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ldr2t" Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.971560 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.973582 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.987221 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-jxzvd"] Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.989719 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-gp6g8"] Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.990375 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-gp6g8" Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.996269 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-x47d6" Oct 09 19:39:49 crc kubenswrapper[4907]: I1009 19:39:49.997623 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-gp6g8"] Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.010026 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-kmxnd"] Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.011137 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.013952 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-kmxnd"] Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.015487 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nf9dn" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.071109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4kvg\" (UniqueName: \"kubernetes.io/projected/023075d5-e7bd-49f9-876a-d728fa5d66ce-kube-api-access-d4kvg\") pod \"cert-manager-cainjector-7f985d654d-jxzvd\" (UID: \"023075d5-e7bd-49f9-876a-d728fa5d66ce\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.071180 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc25h\" (UniqueName: \"kubernetes.io/projected/2fed30c7-d2fb-4a70-ae72-6dd33133aa94-kube-api-access-wc25h\") pod \"cert-manager-webhook-5655c58dd6-kmxnd\" (UID: \"2fed30c7-d2fb-4a70-ae72-6dd33133aa94\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.071286 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qd2x\" (UniqueName: \"kubernetes.io/projected/9000b916-e219-410f-8e0c-29d959f4527b-kube-api-access-7qd2x\") pod \"cert-manager-5b446d88c5-gp6g8\" (UID: \"9000b916-e219-410f-8e0c-29d959f4527b\") " pod="cert-manager/cert-manager-5b446d88c5-gp6g8" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.172018 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qd2x\" (UniqueName: 
\"kubernetes.io/projected/9000b916-e219-410f-8e0c-29d959f4527b-kube-api-access-7qd2x\") pod \"cert-manager-5b446d88c5-gp6g8\" (UID: \"9000b916-e219-410f-8e0c-29d959f4527b\") " pod="cert-manager/cert-manager-5b446d88c5-gp6g8" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.172094 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4kvg\" (UniqueName: \"kubernetes.io/projected/023075d5-e7bd-49f9-876a-d728fa5d66ce-kube-api-access-d4kvg\") pod \"cert-manager-cainjector-7f985d654d-jxzvd\" (UID: \"023075d5-e7bd-49f9-876a-d728fa5d66ce\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.172134 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc25h\" (UniqueName: \"kubernetes.io/projected/2fed30c7-d2fb-4a70-ae72-6dd33133aa94-kube-api-access-wc25h\") pod \"cert-manager-webhook-5655c58dd6-kmxnd\" (UID: \"2fed30c7-d2fb-4a70-ae72-6dd33133aa94\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.199327 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qd2x\" (UniqueName: \"kubernetes.io/projected/9000b916-e219-410f-8e0c-29d959f4527b-kube-api-access-7qd2x\") pod \"cert-manager-5b446d88c5-gp6g8\" (UID: \"9000b916-e219-410f-8e0c-29d959f4527b\") " pod="cert-manager/cert-manager-5b446d88c5-gp6g8" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.201418 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4kvg\" (UniqueName: \"kubernetes.io/projected/023075d5-e7bd-49f9-876a-d728fa5d66ce-kube-api-access-d4kvg\") pod \"cert-manager-cainjector-7f985d654d-jxzvd\" (UID: \"023075d5-e7bd-49f9-876a-d728fa5d66ce\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.202186 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wc25h\" (UniqueName: \"kubernetes.io/projected/2fed30c7-d2fb-4a70-ae72-6dd33133aa94-kube-api-access-wc25h\") pod \"cert-manager-webhook-5655c58dd6-kmxnd\" (UID: \"2fed30c7-d2fb-4a70-ae72-6dd33133aa94\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.283816 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.303608 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-gp6g8" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.329460 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.516654 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-jxzvd"] Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.605943 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-kmxnd"] Oct 09 19:39:50 crc kubenswrapper[4907]: I1009 19:39:50.754429 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-gp6g8"] Oct 09 19:39:50 crc kubenswrapper[4907]: W1009 19:39:50.754912 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9000b916_e219_410f_8e0c_29d959f4527b.slice/crio-310c115b7595cf4959814049ae24b017d38d32b02e14eac286a44a072451a13c WatchSource:0}: Error finding container 310c115b7595cf4959814049ae24b017d38d32b02e14eac286a44a072451a13c: Status 404 returned error can't find the container with id 310c115b7595cf4959814049ae24b017d38d32b02e14eac286a44a072451a13c Oct 09 19:39:51 crc 
kubenswrapper[4907]: I1009 19:39:51.114882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" event={"ID":"2fed30c7-d2fb-4a70-ae72-6dd33133aa94","Type":"ContainerStarted","Data":"b547d5b0f96f2e846312020ffde80112263d9df8e54179752de6efa21524a9f9"} Oct 09 19:39:51 crc kubenswrapper[4907]: I1009 19:39:51.117787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" event={"ID":"023075d5-e7bd-49f9-876a-d728fa5d66ce","Type":"ContainerStarted","Data":"c3a618f162dc4578e4f37e9d1787408d4442e9ee9a612c266b37473b39fef286"} Oct 09 19:39:51 crc kubenswrapper[4907]: I1009 19:39:51.124741 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-gp6g8" event={"ID":"9000b916-e219-410f-8e0c-29d959f4527b","Type":"ContainerStarted","Data":"310c115b7595cf4959814049ae24b017d38d32b02e14eac286a44a072451a13c"} Oct 09 19:39:53 crc kubenswrapper[4907]: I1009 19:39:53.158120 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" event={"ID":"023075d5-e7bd-49f9-876a-d728fa5d66ce","Type":"ContainerStarted","Data":"8f78ae21f7281c2b040dd0e04644f31717558349b33a9bf05de295be7f43834b"} Oct 09 19:39:53 crc kubenswrapper[4907]: I1009 19:39:53.168397 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-jxzvd" podStartSLOduration=2.158296716 podStartE2EDuration="4.168381375s" podCreationTimestamp="2025-10-09 19:39:49 +0000 UTC" firstStartedPulling="2025-10-09 19:39:50.527731091 +0000 UTC m=+676.059698580" lastFinishedPulling="2025-10-09 19:39:52.53781575 +0000 UTC m=+678.069783239" observedRunningTime="2025-10-09 19:39:53.167769159 +0000 UTC m=+678.699736668" watchObservedRunningTime="2025-10-09 19:39:53.168381375 +0000 UTC m=+678.700348864" Oct 09 19:39:55 crc kubenswrapper[4907]: I1009 19:39:55.181876 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-gp6g8" event={"ID":"9000b916-e219-410f-8e0c-29d959f4527b","Type":"ContainerStarted","Data":"cbf2b6c5df0ad90413027618f0e3d6a4abe64b20dee1cc2ba6636d8e841a2b1c"} Oct 09 19:39:55 crc kubenswrapper[4907]: I1009 19:39:55.193841 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" event={"ID":"2fed30c7-d2fb-4a70-ae72-6dd33133aa94","Type":"ContainerStarted","Data":"20d6fe33620676ead28bc60e4e2478f5524e37cb781dd453c1dd758af4a24030"} Oct 09 19:39:55 crc kubenswrapper[4907]: I1009 19:39:55.194005 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" Oct 09 19:39:55 crc kubenswrapper[4907]: I1009 19:39:55.236254 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" podStartSLOduration=2.443585788 podStartE2EDuration="6.236233085s" podCreationTimestamp="2025-10-09 19:39:49 +0000 UTC" firstStartedPulling="2025-10-09 19:39:50.610320318 +0000 UTC m=+676.142287807" lastFinishedPulling="2025-10-09 19:39:54.402967615 +0000 UTC m=+679.934935104" observedRunningTime="2025-10-09 19:39:55.234422529 +0000 UTC m=+680.766390118" watchObservedRunningTime="2025-10-09 19:39:55.236233085 +0000 UTC m=+680.768200584" Oct 09 19:39:55 crc kubenswrapper[4907]: I1009 19:39:55.265719 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-gp6g8" podStartSLOduration=2.622243248 podStartE2EDuration="6.26569764s" podCreationTimestamp="2025-10-09 19:39:49 +0000 UTC" firstStartedPulling="2025-10-09 19:39:50.756757592 +0000 UTC m=+676.288725081" lastFinishedPulling="2025-10-09 19:39:54.400211974 +0000 UTC m=+679.932179473" observedRunningTime="2025-10-09 19:39:55.264178242 +0000 UTC m=+680.796145761" watchObservedRunningTime="2025-10-09 19:39:55.26569764 
+0000 UTC m=+680.797665139" Oct 09 19:40:00 crc kubenswrapper[4907]: I1009 19:40:00.333276 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-kmxnd" Oct 09 19:40:06 crc kubenswrapper[4907]: I1009 19:40:06.299263 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:40:06 crc kubenswrapper[4907]: I1009 19:40:06.300190 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:40:36 crc kubenswrapper[4907]: I1009 19:40:36.299409 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:40:36 crc kubenswrapper[4907]: I1009 19:40:36.299976 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:40:38 crc kubenswrapper[4907]: I1009 19:40:38.793163 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh"] Oct 09 19:40:38 crc kubenswrapper[4907]: I1009 19:40:38.794225 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:38 crc kubenswrapper[4907]: I1009 19:40:38.795769 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 19:40:38 crc kubenswrapper[4907]: I1009 19:40:38.803564 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh"] Oct 09 19:40:38 crc kubenswrapper[4907]: I1009 19:40:38.933722 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8trf\" (UniqueName: \"kubernetes.io/projected/f26ae800-7648-48e3-a47c-ec626aead3dc-kube-api-access-v8trf\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:38 crc kubenswrapper[4907]: I1009 19:40:38.933774 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:38 crc kubenswrapper[4907]: I1009 19:40:38.933924 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.035067 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.035213 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8trf\" (UniqueName: \"kubernetes.io/projected/f26ae800-7648-48e3-a47c-ec626aead3dc-kube-api-access-v8trf\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.035273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.036280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.036519 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.053565 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8trf\" (UniqueName: \"kubernetes.io/projected/f26ae800-7648-48e3-a47c-ec626aead3dc-kube-api-access-v8trf\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.109866 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:39 crc kubenswrapper[4907]: I1009 19:40:39.550501 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh"] Oct 09 19:40:39 crc kubenswrapper[4907]: W1009 19:40:39.561662 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26ae800_7648_48e3_a47c_ec626aead3dc.slice/crio-93927ae2be66868047a7e455602b221258b95be22d38a63e98c808a80853b6a1 WatchSource:0}: Error finding container 93927ae2be66868047a7e455602b221258b95be22d38a63e98c808a80853b6a1: Status 404 returned error can't find the container with id 93927ae2be66868047a7e455602b221258b95be22d38a63e98c808a80853b6a1 Oct 09 19:40:40 crc kubenswrapper[4907]: I1009 19:40:40.496577 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerID="b6b03fd04a1162b83ad4ed536593cb2a2916adff6ccf5287a0eaee10cab029dc" exitCode=0 Oct 09 19:40:40 crc kubenswrapper[4907]: I1009 19:40:40.496618 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" event={"ID":"f26ae800-7648-48e3-a47c-ec626aead3dc","Type":"ContainerDied","Data":"b6b03fd04a1162b83ad4ed536593cb2a2916adff6ccf5287a0eaee10cab029dc"} Oct 09 19:40:40 crc kubenswrapper[4907]: I1009 19:40:40.496664 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" event={"ID":"f26ae800-7648-48e3-a47c-ec626aead3dc","Type":"ContainerStarted","Data":"93927ae2be66868047a7e455602b221258b95be22d38a63e98c808a80853b6a1"} Oct 09 19:40:42 crc kubenswrapper[4907]: I1009 19:40:42.510504 4907 generic.go:334] "Generic (PLEG): container finished" podID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerID="3f0f1e0e2b66ce443863a2d42c93c3b954b436cece84ff77ac913a901956a862" exitCode=0 Oct 09 19:40:42 crc kubenswrapper[4907]: I1009 19:40:42.510593 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" event={"ID":"f26ae800-7648-48e3-a47c-ec626aead3dc","Type":"ContainerDied","Data":"3f0f1e0e2b66ce443863a2d42c93c3b954b436cece84ff77ac913a901956a862"} Oct 09 19:40:43 crc kubenswrapper[4907]: I1009 19:40:43.520151 4907 generic.go:334] "Generic (PLEG): container finished" podID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerID="1d982daac598a3649968b0646702a61cabd2fdf3cfca1a4dae619aff8d696023" exitCode=0 Oct 09 19:40:43 crc kubenswrapper[4907]: I1009 19:40:43.520251 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" 
event={"ID":"f26ae800-7648-48e3-a47c-ec626aead3dc","Type":"ContainerDied","Data":"1d982daac598a3649968b0646702a61cabd2fdf3cfca1a4dae619aff8d696023"} Oct 09 19:40:44 crc kubenswrapper[4907]: I1009 19:40:44.848863 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:44 crc kubenswrapper[4907]: I1009 19:40:44.914701 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-util\") pod \"f26ae800-7648-48e3-a47c-ec626aead3dc\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " Oct 09 19:40:44 crc kubenswrapper[4907]: I1009 19:40:44.914862 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8trf\" (UniqueName: \"kubernetes.io/projected/f26ae800-7648-48e3-a47c-ec626aead3dc-kube-api-access-v8trf\") pod \"f26ae800-7648-48e3-a47c-ec626aead3dc\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " Oct 09 19:40:44 crc kubenswrapper[4907]: I1009 19:40:44.914903 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-bundle\") pod \"f26ae800-7648-48e3-a47c-ec626aead3dc\" (UID: \"f26ae800-7648-48e3-a47c-ec626aead3dc\") " Oct 09 19:40:44 crc kubenswrapper[4907]: I1009 19:40:44.916061 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-bundle" (OuterVolumeSpecName: "bundle") pod "f26ae800-7648-48e3-a47c-ec626aead3dc" (UID: "f26ae800-7648-48e3-a47c-ec626aead3dc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:40:44 crc kubenswrapper[4907]: I1009 19:40:44.922415 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26ae800-7648-48e3-a47c-ec626aead3dc-kube-api-access-v8trf" (OuterVolumeSpecName: "kube-api-access-v8trf") pod "f26ae800-7648-48e3-a47c-ec626aead3dc" (UID: "f26ae800-7648-48e3-a47c-ec626aead3dc"). InnerVolumeSpecName "kube-api-access-v8trf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:40:44 crc kubenswrapper[4907]: I1009 19:40:44.930773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-util" (OuterVolumeSpecName: "util") pod "f26ae800-7648-48e3-a47c-ec626aead3dc" (UID: "f26ae800-7648-48e3-a47c-ec626aead3dc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:40:45 crc kubenswrapper[4907]: I1009 19:40:45.016984 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:40:45 crc kubenswrapper[4907]: I1009 19:40:45.017050 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26ae800-7648-48e3-a47c-ec626aead3dc-util\") on node \"crc\" DevicePath \"\"" Oct 09 19:40:45 crc kubenswrapper[4907]: I1009 19:40:45.017073 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8trf\" (UniqueName: \"kubernetes.io/projected/f26ae800-7648-48e3-a47c-ec626aead3dc-kube-api-access-v8trf\") on node \"crc\" DevicePath \"\"" Oct 09 19:40:45 crc kubenswrapper[4907]: I1009 19:40:45.535261 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" 
event={"ID":"f26ae800-7648-48e3-a47c-ec626aead3dc","Type":"ContainerDied","Data":"93927ae2be66868047a7e455602b221258b95be22d38a63e98c808a80853b6a1"} Oct 09 19:40:45 crc kubenswrapper[4907]: I1009 19:40:45.535312 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93927ae2be66868047a7e455602b221258b95be22d38a63e98c808a80853b6a1" Oct 09 19:40:45 crc kubenswrapper[4907]: I1009 19:40:45.535398 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.622146 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb"] Oct 09 19:40:47 crc kubenswrapper[4907]: E1009 19:40:47.622734 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerName="extract" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.622750 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerName="extract" Oct 09 19:40:47 crc kubenswrapper[4907]: E1009 19:40:47.622766 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerName="util" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.622775 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerName="util" Oct 09 19:40:47 crc kubenswrapper[4907]: E1009 19:40:47.622786 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerName="pull" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.622795 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerName="pull" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.622943 4907 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f26ae800-7648-48e3-a47c-ec626aead3dc" containerName="extract" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.623406 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.625230 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.625541 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.629063 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-z625z" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.642235 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb"] Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.751038 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzsg\" (UniqueName: \"kubernetes.io/projected/9c9331a0-3676-4106-957f-5699d256f0d6-kube-api-access-bzzsg\") pod \"nmstate-operator-858ddd8f98-2g9nb\" (UID: \"9c9331a0-3676-4106-957f-5699d256f0d6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.852395 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzsg\" (UniqueName: \"kubernetes.io/projected/9c9331a0-3676-4106-957f-5699d256f0d6-kube-api-access-bzzsg\") pod \"nmstate-operator-858ddd8f98-2g9nb\" (UID: \"9c9331a0-3676-4106-957f-5699d256f0d6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb" Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.873939 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bzzsg\" (UniqueName: \"kubernetes.io/projected/9c9331a0-3676-4106-957f-5699d256f0d6-kube-api-access-bzzsg\") pod \"nmstate-operator-858ddd8f98-2g9nb\" (UID: \"9c9331a0-3676-4106-957f-5699d256f0d6\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb"
Oct 09 19:40:47 crc kubenswrapper[4907]: I1009 19:40:47.939524 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb"
Oct 09 19:40:48 crc kubenswrapper[4907]: I1009 19:40:48.414895 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb"]
Oct 09 19:40:48 crc kubenswrapper[4907]: I1009 19:40:48.580782 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb" event={"ID":"9c9331a0-3676-4106-957f-5699d256f0d6","Type":"ContainerStarted","Data":"9b6ab802c1565ff69b56a4d8ec2d2189e3cab3bf039772c52267e6c759b57031"}
Oct 09 19:40:52 crc kubenswrapper[4907]: I1009 19:40:52.606087 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb" event={"ID":"9c9331a0-3676-4106-957f-5699d256f0d6","Type":"ContainerStarted","Data":"56fefc5863c6d91614da76b1c044e420878b3dd95c91caa09e5d2081af458a99"}
Oct 09 19:40:52 crc kubenswrapper[4907]: I1009 19:40:52.637653 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-2g9nb" podStartSLOduration=1.773067137 podStartE2EDuration="5.637632958s" podCreationTimestamp="2025-10-09 19:40:47 +0000 UTC" firstStartedPulling="2025-10-09 19:40:48.425138258 +0000 UTC m=+733.957105747" lastFinishedPulling="2025-10-09 19:40:52.289704069 +0000 UTC m=+737.821671568" observedRunningTime="2025-10-09 19:40:52.631702946 +0000 UTC m=+738.163670495" watchObservedRunningTime="2025-10-09 19:40:52.637632958 +0000 UTC m=+738.169600447"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.510604 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.511988 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.514185 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-prcxp"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.516673 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.517249 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.519787 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.525226 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.530945 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-92f76"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.531781 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.557342 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.645310 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.646382 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.648786 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.649263 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-76r2m"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.649420 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.654266 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.665572 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/71ed6ce6-21bd-4132-9ba3-d344520de4a9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-n97k6\" (UID: \"71ed6ce6-21bd-4132-9ba3-d344520de4a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.665728 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwnr2\" (UniqueName: \"kubernetes.io/projected/71ed6ce6-21bd-4132-9ba3-d344520de4a9-kube-api-access-lwnr2\") pod \"nmstate-webhook-6cdbc54649-n97k6\" (UID: \"71ed6ce6-21bd-4132-9ba3-d344520de4a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.665864 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2p2j\" (UniqueName: \"kubernetes.io/projected/bda49bd2-44dc-4a59-becb-c3942059ab4d-kube-api-access-c2p2j\") pod \"nmstate-metrics-fdff9cb8d-pfsd4\" (UID: \"bda49bd2-44dc-4a59-becb-c3942059ab4d\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.665953 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-ovs-socket\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.666049 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-nmstate-lock\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.666094 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzjxn\" (UniqueName: \"kubernetes.io/projected/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-kube-api-access-lzjxn\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.666232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-dbus-socket\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767519 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-ovs-socket\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767574 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvllc\" (UniqueName: \"kubernetes.io/projected/d33978d9-506b-49de-ad7c-d4fd0cb80c79-kube-api-access-mvllc\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767620 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d33978d9-506b-49de-ad7c-d4fd0cb80c79-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767631 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-ovs-socket\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767651 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-nmstate-lock\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767677 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzjxn\" (UniqueName: \"kubernetes.io/projected/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-kube-api-access-lzjxn\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767722 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d33978d9-506b-49de-ad7c-d4fd0cb80c79-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767757 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-dbus-socket\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767798 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/71ed6ce6-21bd-4132-9ba3-d344520de4a9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-n97k6\" (UID: \"71ed6ce6-21bd-4132-9ba3-d344520de4a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767833 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwnr2\" (UniqueName: \"kubernetes.io/projected/71ed6ce6-21bd-4132-9ba3-d344520de4a9-kube-api-access-lwnr2\") pod \"nmstate-webhook-6cdbc54649-n97k6\" (UID: \"71ed6ce6-21bd-4132-9ba3-d344520de4a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767898 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2p2j\" (UniqueName: \"kubernetes.io/projected/bda49bd2-44dc-4a59-becb-c3942059ab4d-kube-api-access-c2p2j\") pod \"nmstate-metrics-fdff9cb8d-pfsd4\" (UID: \"bda49bd2-44dc-4a59-becb-c3942059ab4d\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.767721 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-nmstate-lock\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.768292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-dbus-socket\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.776130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/71ed6ce6-21bd-4132-9ba3-d344520de4a9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-n97k6\" (UID: \"71ed6ce6-21bd-4132-9ba3-d344520de4a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.796360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwnr2\" (UniqueName: \"kubernetes.io/projected/71ed6ce6-21bd-4132-9ba3-d344520de4a9-kube-api-access-lwnr2\") pod \"nmstate-webhook-6cdbc54649-n97k6\" (UID: \"71ed6ce6-21bd-4132-9ba3-d344520de4a9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.796413 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2p2j\" (UniqueName: \"kubernetes.io/projected/bda49bd2-44dc-4a59-becb-c3942059ab4d-kube-api-access-c2p2j\") pod \"nmstate-metrics-fdff9cb8d-pfsd4\" (UID: \"bda49bd2-44dc-4a59-becb-c3942059ab4d\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.803197 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzjxn\" (UniqueName: \"kubernetes.io/projected/a9bc1d23-a1e5-4879-ad7d-635639d6cb12-kube-api-access-lzjxn\") pod \"nmstate-handler-92f76\" (UID: \"a9bc1d23-a1e5-4879-ad7d-635639d6cb12\") " pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.840989 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.858101 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-686dc57ddf-qgb9b"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.859233 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.868743 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d33978d9-506b-49de-ad7c-d4fd0cb80c79-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.868842 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvllc\" (UniqueName: \"kubernetes.io/projected/d33978d9-506b-49de-ad7c-d4fd0cb80c79-kube-api-access-mvllc\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.868876 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d33978d9-506b-49de-ad7c-d4fd0cb80c79-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.868992 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.870513 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d33978d9-506b-49de-ad7c-d4fd0cb80c79-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.878362 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d33978d9-506b-49de-ad7c-d4fd0cb80c79-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.881788 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-686dc57ddf-qgb9b"]
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.887150 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.891185 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvllc\" (UniqueName: \"kubernetes.io/projected/d33978d9-506b-49de-ad7c-d4fd0cb80c79-kube-api-access-mvllc\") pod \"nmstate-console-plugin-6b874cbd85-2pltk\" (UID: \"d33978d9-506b-49de-ad7c-d4fd0cb80c79\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.965927 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.969669 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-trusted-ca-bundle\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.969752 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-serving-cert\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.969788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-config\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.969834 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-oauth-config\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.969875 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x6rg\" (UniqueName: \"kubernetes.io/projected/4358f40c-5c76-4f86-a2f4-709f88552a7b-kube-api-access-4x6rg\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.969901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-service-ca\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:53 crc kubenswrapper[4907]: I1009 19:40:53.969974 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-oauth-serving-cert\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.071422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-trusted-ca-bundle\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.071479 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-serving-cert\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.071497 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-config\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.071524 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-oauth-config\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.071554 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x6rg\" (UniqueName: \"kubernetes.io/projected/4358f40c-5c76-4f86-a2f4-709f88552a7b-kube-api-access-4x6rg\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.071576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-service-ca\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.071613 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-oauth-serving-cert\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.072563 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-oauth-serving-cert\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.073442 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-config\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.074101 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-trusted-ca-bundle\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.074534 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4358f40c-5c76-4f86-a2f4-709f88552a7b-service-ca\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.074642 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"]
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.078163 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-serving-cert\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.079068 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4358f40c-5c76-4f86-a2f4-709f88552a7b-console-oauth-config\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.091330 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x6rg\" (UniqueName: \"kubernetes.io/projected/4358f40c-5c76-4f86-a2f4-709f88552a7b-kube-api-access-4x6rg\") pod \"console-686dc57ddf-qgb9b\" (UID: \"4358f40c-5c76-4f86-a2f4-709f88552a7b\") " pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.120539 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4"]
Oct 09 19:40:54 crc kubenswrapper[4907]: W1009 19:40:54.131057 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda49bd2_44dc_4a59_becb_c3942059ab4d.slice/crio-1f6bae59c476bc9eae3ec023f1b711272f74f0762eedb78ffb77ce7b8fa0df1a WatchSource:0}: Error finding container 1f6bae59c476bc9eae3ec023f1b711272f74f0762eedb78ffb77ce7b8fa0df1a: Status 404 returned error can't find the container with id 1f6bae59c476bc9eae3ec023f1b711272f74f0762eedb78ffb77ce7b8fa0df1a
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.166118 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk"]
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.219123 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-686dc57ddf-qgb9b"
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.442902 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-686dc57ddf-qgb9b"]
Oct 09 19:40:54 crc kubenswrapper[4907]: W1009 19:40:54.456637 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4358f40c_5c76_4f86_a2f4_709f88552a7b.slice/crio-8bb78bd0b9e4d983945fcc240e669d978b32a73dffe39adc94b9e7d8ea48e580 WatchSource:0}: Error finding container 8bb78bd0b9e4d983945fcc240e669d978b32a73dffe39adc94b9e7d8ea48e580: Status 404 returned error can't find the container with id 8bb78bd0b9e4d983945fcc240e669d978b32a73dffe39adc94b9e7d8ea48e580
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.622556 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6" event={"ID":"71ed6ce6-21bd-4132-9ba3-d344520de4a9","Type":"ContainerStarted","Data":"5d6d9f81273ba721580180b423b553b024c67b79d31f58fbfe4f0e2a8c6b668d"}
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.624383 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-686dc57ddf-qgb9b" event={"ID":"4358f40c-5c76-4f86-a2f4-709f88552a7b","Type":"ContainerStarted","Data":"b34bb782819126cba59d2addde793acd76675d13a3a4876eb2d0be75997393ac"}
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.624430 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-686dc57ddf-qgb9b" event={"ID":"4358f40c-5c76-4f86-a2f4-709f88552a7b","Type":"ContainerStarted","Data":"8bb78bd0b9e4d983945fcc240e669d978b32a73dffe39adc94b9e7d8ea48e580"}
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.626842 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk" event={"ID":"d33978d9-506b-49de-ad7c-d4fd0cb80c79","Type":"ContainerStarted","Data":"6426907a085d226f2ededffe0b222988b9047d3de54781988fae497cbef0c0be"}
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.628850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-92f76" event={"ID":"a9bc1d23-a1e5-4879-ad7d-635639d6cb12","Type":"ContainerStarted","Data":"5a583e83603a0993306cb1a65253737fb73a1b0ad6ca2955b2520279840ffa96"}
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.629905 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4" event={"ID":"bda49bd2-44dc-4a59-becb-c3942059ab4d","Type":"ContainerStarted","Data":"1f6bae59c476bc9eae3ec023f1b711272f74f0762eedb78ffb77ce7b8fa0df1a"}
Oct 09 19:40:54 crc kubenswrapper[4907]: I1009 19:40:54.649193 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-686dc57ddf-qgb9b" podStartSLOduration=1.649167136 podStartE2EDuration="1.649167136s" podCreationTimestamp="2025-10-09 19:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:40:54.645932383 +0000 UTC m=+740.177899902" watchObservedRunningTime="2025-10-09 19:40:54.649167136 +0000 UTC m=+740.181134645"
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.651097 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4" event={"ID":"bda49bd2-44dc-4a59-becb-c3942059ab4d","Type":"ContainerStarted","Data":"29577c2f86413b79a8b3b6bf615ae562f6b579b91e4053e4c4b1a995a136b574"}
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.653098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6" event={"ID":"71ed6ce6-21bd-4132-9ba3-d344520de4a9","Type":"ContainerStarted","Data":"67f4d46e0be5818c827d84d271147ee5f599d5f62f47353d1a9b78bfa24bb127"}
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.653234 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6"
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.655056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk" event={"ID":"d33978d9-506b-49de-ad7c-d4fd0cb80c79","Type":"ContainerStarted","Data":"83ac4b40b55b2a93399c1a9cc6e1a1e56f2131303ee352fd91146d0a61e08efd"}
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.657066 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-92f76" event={"ID":"a9bc1d23-a1e5-4879-ad7d-635639d6cb12","Type":"ContainerStarted","Data":"7619028ce2587fcb48a2abbd137d70828b39d1a080e3b71cc5543453f5384b0a"}
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.657150 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-92f76"
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.680740 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6" podStartSLOduration=1.89390063 podStartE2EDuration="4.680716791s" podCreationTimestamp="2025-10-09 19:40:53 +0000 UTC" firstStartedPulling="2025-10-09 19:40:54.081062482 +0000 UTC m=+739.613029981" lastFinishedPulling="2025-10-09 19:40:56.867878643 +0000 UTC m=+742.399846142" observedRunningTime="2025-10-09 19:40:57.676725509 +0000 UTC m=+743.208693058" watchObservedRunningTime="2025-10-09 19:40:57.680716791 +0000 UTC m=+743.212684300"
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.704440 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-2pltk" podStartSLOduration=2.031004875 podStartE2EDuration="4.704421789s" podCreationTimestamp="2025-10-09 19:40:53 +0000 UTC" firstStartedPulling="2025-10-09 19:40:54.174670592 +0000 UTC m=+739.706638081" lastFinishedPulling="2025-10-09 19:40:56.848087496 +0000 UTC m=+742.380054995" observedRunningTime="2025-10-09 19:40:57.696063305 +0000 UTC m=+743.228030834" watchObservedRunningTime="2025-10-09 19:40:57.704421789 +0000 UTC m=+743.236389278"
Oct 09 19:40:57 crc kubenswrapper[4907]: I1009 19:40:57.717450 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-92f76" podStartSLOduration=1.790488708 podStartE2EDuration="4.717434632s" podCreationTimestamp="2025-10-09 19:40:53 +0000 UTC" firstStartedPulling="2025-10-09 19:40:53.928210103 +0000 UTC m=+739.460177592" lastFinishedPulling="2025-10-09 19:40:56.855156017 +0000 UTC m=+742.387123516" observedRunningTime="2025-10-09 19:40:57.714196759 +0000 UTC m=+743.246164248" watchObservedRunningTime="2025-10-09 19:40:57.717434632 +0000 UTC m=+743.249402121"
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.445957 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m47fb"]
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.446241 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" podUID="5c696fd0-572e-4fcf-bd2b-66cda008888b" containerName="controller-manager" containerID="cri-o://fcdf5602d765b0a817118acb3e3f83dc7ee26f8254d5b19b660aa69ecc2937f7" gracePeriod=30
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.543103 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf"]
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.543307 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" podUID="65313d6f-23ee-4269-ad8c-140fb200c3e5" containerName="route-controller-manager" containerID="cri-o://fc4878719368c7642a684a882d60430af735763ccffc25a466d27ae17994fc3b" gracePeriod=30
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.676115 4907 generic.go:334] "Generic (PLEG): container finished" podID="5c696fd0-572e-4fcf-bd2b-66cda008888b" containerID="fcdf5602d765b0a817118acb3e3f83dc7ee26f8254d5b19b660aa69ecc2937f7" exitCode=0
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.676181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" event={"ID":"5c696fd0-572e-4fcf-bd2b-66cda008888b","Type":"ContainerDied","Data":"fcdf5602d765b0a817118acb3e3f83dc7ee26f8254d5b19b660aa69ecc2937f7"}
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.679206 4907 generic.go:334] "Generic (PLEG): container finished" podID="65313d6f-23ee-4269-ad8c-140fb200c3e5" containerID="fc4878719368c7642a684a882d60430af735763ccffc25a466d27ae17994fc3b" exitCode=0
Oct 09 19:40:59 crc kubenswrapper[4907]: I1009 19:40:59.679241 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" event={"ID":"65313d6f-23ee-4269-ad8c-140fb200c3e5","Type":"ContainerDied","Data":"fc4878719368c7642a684a882d60430af735763ccffc25a466d27ae17994fc3b"}
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.113835 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb"
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.117954 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf"
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261082 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c696fd0-572e-4fcf-bd2b-66cda008888b-serving-cert\") pod \"5c696fd0-572e-4fcf-bd2b-66cda008888b\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") "
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261149 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65313d6f-23ee-4269-ad8c-140fb200c3e5-serving-cert\") pod \"65313d6f-23ee-4269-ad8c-140fb200c3e5\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") "
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261188 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwxcr\" (UniqueName: \"kubernetes.io/projected/65313d6f-23ee-4269-ad8c-140fb200c3e5-kube-api-access-jwxcr\") pod \"65313d6f-23ee-4269-ad8c-140fb200c3e5\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") "
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261254 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-proxy-ca-bundles\") pod \"5c696fd0-572e-4fcf-bd2b-66cda008888b\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") "
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq6x9\" (UniqueName: \"kubernetes.io/projected/5c696fd0-572e-4fcf-bd2b-66cda008888b-kube-api-access-gq6x9\") pod \"5c696fd0-572e-4fcf-bd2b-66cda008888b\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") "
Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261315 4907 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-config\") pod \"5c696fd0-572e-4fcf-bd2b-66cda008888b\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261348 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-client-ca\") pod \"65313d6f-23ee-4269-ad8c-140fb200c3e5\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261372 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-config\") pod \"65313d6f-23ee-4269-ad8c-140fb200c3e5\" (UID: \"65313d6f-23ee-4269-ad8c-140fb200c3e5\") " Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.261413 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-client-ca\") pod \"5c696fd0-572e-4fcf-bd2b-66cda008888b\" (UID: \"5c696fd0-572e-4fcf-bd2b-66cda008888b\") " Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.262833 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5c696fd0-572e-4fcf-bd2b-66cda008888b" (UID: "5c696fd0-572e-4fcf-bd2b-66cda008888b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.263570 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-config" (OuterVolumeSpecName: "config") pod "65313d6f-23ee-4269-ad8c-140fb200c3e5" (UID: "65313d6f-23ee-4269-ad8c-140fb200c3e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.263587 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-config" (OuterVolumeSpecName: "config") pod "5c696fd0-572e-4fcf-bd2b-66cda008888b" (UID: "5c696fd0-572e-4fcf-bd2b-66cda008888b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.263849 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-client-ca" (OuterVolumeSpecName: "client-ca") pod "5c696fd0-572e-4fcf-bd2b-66cda008888b" (UID: "5c696fd0-572e-4fcf-bd2b-66cda008888b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.266041 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-client-ca" (OuterVolumeSpecName: "client-ca") pod "65313d6f-23ee-4269-ad8c-140fb200c3e5" (UID: "65313d6f-23ee-4269-ad8c-140fb200c3e5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.269070 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65313d6f-23ee-4269-ad8c-140fb200c3e5-kube-api-access-jwxcr" (OuterVolumeSpecName: "kube-api-access-jwxcr") pod "65313d6f-23ee-4269-ad8c-140fb200c3e5" (UID: "65313d6f-23ee-4269-ad8c-140fb200c3e5"). InnerVolumeSpecName "kube-api-access-jwxcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.269253 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c696fd0-572e-4fcf-bd2b-66cda008888b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5c696fd0-572e-4fcf-bd2b-66cda008888b" (UID: "5c696fd0-572e-4fcf-bd2b-66cda008888b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.269564 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65313d6f-23ee-4269-ad8c-140fb200c3e5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "65313d6f-23ee-4269-ad8c-140fb200c3e5" (UID: "65313d6f-23ee-4269-ad8c-140fb200c3e5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.269671 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c696fd0-572e-4fcf-bd2b-66cda008888b-kube-api-access-gq6x9" (OuterVolumeSpecName: "kube-api-access-gq6x9") pod "5c696fd0-572e-4fcf-bd2b-66cda008888b" (UID: "5c696fd0-572e-4fcf-bd2b-66cda008888b"). InnerVolumeSpecName "kube-api-access-gq6x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363313 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363370 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65313d6f-23ee-4269-ad8c-140fb200c3e5-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363389 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363405 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c696fd0-572e-4fcf-bd2b-66cda008888b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363422 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65313d6f-23ee-4269-ad8c-140fb200c3e5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363442 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwxcr\" (UniqueName: \"kubernetes.io/projected/65313d6f-23ee-4269-ad8c-140fb200c3e5-kube-api-access-jwxcr\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363484 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363503 4907 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c696fd0-572e-4fcf-bd2b-66cda008888b-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.363525 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq6x9\" (UniqueName: \"kubernetes.io/projected/5c696fd0-572e-4fcf-bd2b-66cda008888b-kube-api-access-gq6x9\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.688331 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4" event={"ID":"bda49bd2-44dc-4a59-becb-c3942059ab4d","Type":"ContainerStarted","Data":"6a6738f52a47d6747b1d79fe6aca8bb561dcd801362c99c3e705f6c90dbeb6f0"} Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.691742 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.691801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-m47fb" event={"ID":"5c696fd0-572e-4fcf-bd2b-66cda008888b","Type":"ContainerDied","Data":"33c379ddd63c72e96dffc1ca12a2606da46400a532ec962616f95dc8e8876386"} Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.691832 4907 scope.go:117] "RemoveContainer" containerID="fcdf5602d765b0a817118acb3e3f83dc7ee26f8254d5b19b660aa69ecc2937f7" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.729681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" event={"ID":"65313d6f-23ee-4269-ad8c-140fb200c3e5","Type":"ContainerDied","Data":"d95aa5087e5890e84a524f286b89c9beffcdfaa267977044efc5307ef0a89b2a"} Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.729824 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.729954 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pfsd4" podStartSLOduration=1.9561821560000001 podStartE2EDuration="7.72993821s" podCreationTimestamp="2025-10-09 19:40:53 +0000 UTC" firstStartedPulling="2025-10-09 19:40:54.145293118 +0000 UTC m=+739.677260607" lastFinishedPulling="2025-10-09 19:40:59.919049172 +0000 UTC m=+745.451016661" observedRunningTime="2025-10-09 19:41:00.727949279 +0000 UTC m=+746.259916808" watchObservedRunningTime="2025-10-09 19:41:00.72993821 +0000 UTC m=+746.261905729" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.782411 4907 scope.go:117] "RemoveContainer" containerID="fc4878719368c7642a684a882d60430af735763ccffc25a466d27ae17994fc3b" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.793556 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m47fb"] Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.796694 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-m47fb"] Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.817983 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf"] Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.827236 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ngfwf"] Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.921958 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-96c796646-4pk8x"] Oct 09 19:41:00 crc kubenswrapper[4907]: E1009 19:41:00.922203 4907 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="65313d6f-23ee-4269-ad8c-140fb200c3e5" containerName="route-controller-manager" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.922224 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="65313d6f-23ee-4269-ad8c-140fb200c3e5" containerName="route-controller-manager" Oct 09 19:41:00 crc kubenswrapper[4907]: E1009 19:41:00.922246 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c696fd0-572e-4fcf-bd2b-66cda008888b" containerName="controller-manager" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.922255 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c696fd0-572e-4fcf-bd2b-66cda008888b" containerName="controller-manager" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.922386 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="65313d6f-23ee-4269-ad8c-140fb200c3e5" containerName="route-controller-manager" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.922410 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c696fd0-572e-4fcf-bd2b-66cda008888b" containerName="controller-manager" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.922904 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.924785 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.924823 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.925149 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.925154 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.925195 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.925669 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.930693 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-96c796646-4pk8x"] Oct 09 19:41:00 crc kubenswrapper[4907]: I1009 19:41:00.933722 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.085746 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxkqz\" (UniqueName: \"kubernetes.io/projected/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-kube-api-access-gxkqz\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " 
pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.085833 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-config\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.086043 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-client-ca\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.086129 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-serving-cert\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.086194 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-proxy-ca-bundles\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.149875 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj"] Oct 
09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.150839 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.154169 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.154831 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.154878 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.155099 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.155316 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.154856 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.160625 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c696fd0-572e-4fcf-bd2b-66cda008888b" path="/var/lib/kubelet/pods/5c696fd0-572e-4fcf-bd2b-66cda008888b/volumes" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.162104 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65313d6f-23ee-4269-ad8c-140fb200c3e5" path="/var/lib/kubelet/pods/65313d6f-23ee-4269-ad8c-140fb200c3e5/volumes" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.174219 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj"] Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.194456 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-client-ca\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.194593 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-serving-cert\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.194677 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-proxy-ca-bundles\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.194733 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpqs8\" (UniqueName: \"kubernetes.io/projected/d6c126c6-f4f2-4046-a973-0640c0d1d373-kube-api-access-dpqs8\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.194846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxkqz\" (UniqueName: 
\"kubernetes.io/projected/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-kube-api-access-gxkqz\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.194936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c126c6-f4f2-4046-a973-0640c0d1d373-serving-cert\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.194989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-config\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.195079 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c126c6-f4f2-4046-a973-0640c0d1d373-client-ca\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.195181 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c126c6-f4f2-4046-a973-0640c0d1d373-config\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" 
Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.196809 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-proxy-ca-bundles\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.197360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-config\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.197628 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-client-ca\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.203881 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-serving-cert\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.213239 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxkqz\" (UniqueName: \"kubernetes.io/projected/ee40c8e2-2bd6-42a6-9f96-52b8a56274db-kube-api-access-gxkqz\") pod \"controller-manager-96c796646-4pk8x\" (UID: \"ee40c8e2-2bd6-42a6-9f96-52b8a56274db\") " 
pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.242931 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.296491 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpqs8\" (UniqueName: \"kubernetes.io/projected/d6c126c6-f4f2-4046-a973-0640c0d1d373-kube-api-access-dpqs8\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.296622 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c126c6-f4f2-4046-a973-0640c0d1d373-serving-cert\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.296689 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c126c6-f4f2-4046-a973-0640c0d1d373-client-ca\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.296739 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c126c6-f4f2-4046-a973-0640c0d1d373-config\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " 
pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.299018 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6c126c6-f4f2-4046-a973-0640c0d1d373-client-ca\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.299543 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c126c6-f4f2-4046-a973-0640c0d1d373-config\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.300861 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6c126c6-f4f2-4046-a973-0640c0d1d373-serving-cert\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.318373 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpqs8\" (UniqueName: \"kubernetes.io/projected/d6c126c6-f4f2-4046-a973-0640c0d1d373-kube-api-access-dpqs8\") pod \"route-controller-manager-7dc7f98775-7pbvj\" (UID: \"d6c126c6-f4f2-4046-a973-0640c0d1d373\") " pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.468793 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.510692 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-96c796646-4pk8x"] Oct 09 19:41:01 crc kubenswrapper[4907]: W1009 19:41:01.513414 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee40c8e2_2bd6_42a6_9f96_52b8a56274db.slice/crio-8a4bae0781124df271a15c7a5efec1c44952b2691a8e4cdbdba087a437523993 WatchSource:0}: Error finding container 8a4bae0781124df271a15c7a5efec1c44952b2691a8e4cdbdba087a437523993: Status 404 returned error can't find the container with id 8a4bae0781124df271a15c7a5efec1c44952b2691a8e4cdbdba087a437523993 Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.691520 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj"] Oct 09 19:41:01 crc kubenswrapper[4907]: W1009 19:41:01.704560 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c126c6_f4f2_4046_a973_0640c0d1d373.slice/crio-737f562b863b740bb66e9e3812ef219edd6efa60cacb3b21ae7597b99a5b2f6f WatchSource:0}: Error finding container 737f562b863b740bb66e9e3812ef219edd6efa60cacb3b21ae7597b99a5b2f6f: Status 404 returned error can't find the container with id 737f562b863b740bb66e9e3812ef219edd6efa60cacb3b21ae7597b99a5b2f6f Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.742683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" event={"ID":"d6c126c6-f4f2-4046-a973-0640c0d1d373","Type":"ContainerStarted","Data":"737f562b863b740bb66e9e3812ef219edd6efa60cacb3b21ae7597b99a5b2f6f"} Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.744190 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" event={"ID":"ee40c8e2-2bd6-42a6-9f96-52b8a56274db","Type":"ContainerStarted","Data":"66029f36012f47be35ca59c40d094f936ea257f09cc376120eb52c7caed88652"} Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.744217 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" event={"ID":"ee40c8e2-2bd6-42a6-9f96-52b8a56274db","Type":"ContainerStarted","Data":"8a4bae0781124df271a15c7a5efec1c44952b2691a8e4cdbdba087a437523993"} Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.744743 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.745812 4907 patch_prober.go:28] interesting pod/controller-manager-96c796646-4pk8x container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.745839 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" podUID="ee40c8e2-2bd6-42a6-9f96-52b8a56274db" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Oct 09 19:41:01 crc kubenswrapper[4907]: I1009 19:41:01.773563 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" podStartSLOduration=1.773550924 podStartE2EDuration="1.773550924s" podCreationTimestamp="2025-10-09 19:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-09 19:41:01.771912142 +0000 UTC m=+747.303879631" watchObservedRunningTime="2025-10-09 19:41:01.773550924 +0000 UTC m=+747.305518413" Oct 09 19:41:02 crc kubenswrapper[4907]: I1009 19:41:02.753565 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" event={"ID":"d6c126c6-f4f2-4046-a973-0640c0d1d373","Type":"ContainerStarted","Data":"0f7bf979fab882a1b90067d70a5eda38da76dd946a321f17dbe967ec86c0b73d"} Oct 09 19:41:02 crc kubenswrapper[4907]: I1009 19:41:02.753856 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:02 crc kubenswrapper[4907]: I1009 19:41:02.759878 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" Oct 09 19:41:02 crc kubenswrapper[4907]: I1009 19:41:02.760343 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-96c796646-4pk8x" Oct 09 19:41:02 crc kubenswrapper[4907]: I1009 19:41:02.781725 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7dc7f98775-7pbvj" podStartSLOduration=3.7817056879999997 podStartE2EDuration="3.781705688s" podCreationTimestamp="2025-10-09 19:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:41:02.776410492 +0000 UTC m=+748.308377991" watchObservedRunningTime="2025-10-09 19:41:02.781705688 +0000 UTC m=+748.313673177" Oct 09 19:41:03 crc kubenswrapper[4907]: I1009 19:41:03.922350 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-92f76" Oct 09 19:41:04 crc kubenswrapper[4907]: 
I1009 19:41:04.219700 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-686dc57ddf-qgb9b" Oct 09 19:41:04 crc kubenswrapper[4907]: I1009 19:41:04.219765 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-686dc57ddf-qgb9b" Oct 09 19:41:04 crc kubenswrapper[4907]: I1009 19:41:04.228988 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-686dc57ddf-qgb9b" Oct 09 19:41:04 crc kubenswrapper[4907]: I1009 19:41:04.775114 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-686dc57ddf-qgb9b" Oct 09 19:41:04 crc kubenswrapper[4907]: I1009 19:41:04.844248 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2fnwq"] Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.298928 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.299633 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.299685 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.300259 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"9652a7dfb693b946f43ed7007125b8bc1aa6768f8074819278bd9dc415f2d69d"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.300311 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://9652a7dfb693b946f43ed7007125b8bc1aa6768f8074819278bd9dc415f2d69d" gracePeriod=600 Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.782228 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="9652a7dfb693b946f43ed7007125b8bc1aa6768f8074819278bd9dc415f2d69d" exitCode=0 Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.782275 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"9652a7dfb693b946f43ed7007125b8bc1aa6768f8074819278bd9dc415f2d69d"} Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.782598 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"e52c7a0fe32a558feb0415aa3280260b781b94dc2a29de298da06be1d8aa2d54"} Oct 09 19:41:06 crc kubenswrapper[4907]: I1009 19:41:06.782614 4907 scope.go:117] "RemoveContainer" containerID="20040c38bb94f5f105635ae1ef3d872533313f11c565c46967e83a901a8d6060" Oct 09 19:41:08 crc kubenswrapper[4907]: I1009 19:41:08.277082 4907 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 19:41:13 crc 
kubenswrapper[4907]: I1009 19:41:13.851646 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-n97k6" Oct 09 19:41:27 crc kubenswrapper[4907]: I1009 19:41:27.855599 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv"] Oct 09 19:41:27 crc kubenswrapper[4907]: I1009 19:41:27.857195 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:27 crc kubenswrapper[4907]: I1009 19:41:27.859169 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 19:41:27 crc kubenswrapper[4907]: I1009 19:41:27.869854 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv"] Oct 09 19:41:27 crc kubenswrapper[4907]: I1009 19:41:27.942053 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:27 crc kubenswrapper[4907]: I1009 19:41:27.942097 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsxst\" (UniqueName: \"kubernetes.io/projected/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-kube-api-access-gsxst\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:27 crc 
kubenswrapper[4907]: I1009 19:41:27.942146 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.043499 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.043616 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.043651 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsxst\" (UniqueName: \"kubernetes.io/projected/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-kube-api-access-gsxst\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.044151 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.044216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.065372 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsxst\" (UniqueName: \"kubernetes.io/projected/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-kube-api-access-gsxst\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.178280 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.645418 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv"] Oct 09 19:41:28 crc kubenswrapper[4907]: W1009 19:41:28.655004 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36417bf7_a6b9_4677_baff_e04cd0e7f1dd.slice/crio-9b57907a5637606b15f03796ff60ba22a3852779cc5bd74ee1b64e2e6e816343 WatchSource:0}: Error finding container 9b57907a5637606b15f03796ff60ba22a3852779cc5bd74ee1b64e2e6e816343: Status 404 returned error can't find the container with id 9b57907a5637606b15f03796ff60ba22a3852779cc5bd74ee1b64e2e6e816343 Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.948385 4907 generic.go:334] "Generic (PLEG): container finished" podID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerID="5a295e2d5747d509fa1b99de262a02b811c67c76e37abab003fe929f4b465421" exitCode=0 Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.948448 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" event={"ID":"36417bf7-a6b9-4677-baff-e04cd0e7f1dd","Type":"ContainerDied","Data":"5a295e2d5747d509fa1b99de262a02b811c67c76e37abab003fe929f4b465421"} Oct 09 19:41:28 crc kubenswrapper[4907]: I1009 19:41:28.948505 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" event={"ID":"36417bf7-a6b9-4677-baff-e04cd0e7f1dd","Type":"ContainerStarted","Data":"9b57907a5637606b15f03796ff60ba22a3852779cc5bd74ee1b64e2e6e816343"} Oct 09 19:41:29 crc kubenswrapper[4907]: I1009 19:41:29.902150 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-2fnwq" podUID="957d72db-4cb4-4e97-bb11-2f25eb03f259" containerName="console" containerID="cri-o://137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da" gracePeriod=15 Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.222983 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hxjr9"] Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.228325 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.237075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hxjr9"] Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.377169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-catalog-content\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.377262 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkk4x\" (UniqueName: \"kubernetes.io/projected/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-kube-api-access-xkk4x\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.377302 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-utilities\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc 
kubenswrapper[4907]: I1009 19:41:30.478599 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-catalog-content\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.478658 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkk4x\" (UniqueName: \"kubernetes.io/projected/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-kube-api-access-xkk4x\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.478695 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-utilities\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.479212 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-catalog-content\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.479278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-utilities\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.497305 4907 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2fnwq_957d72db-4cb4-4e97-bb11-2f25eb03f259/console/0.log" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.497395 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.501158 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkk4x\" (UniqueName: \"kubernetes.io/projected/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-kube-api-access-xkk4x\") pod \"redhat-operators-hxjr9\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.558260 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579156 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swh9v\" (UniqueName: \"kubernetes.io/projected/957d72db-4cb4-4e97-bb11-2f25eb03f259-kube-api-access-swh9v\") pod \"957d72db-4cb4-4e97-bb11-2f25eb03f259\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579205 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-trusted-ca-bundle\") pod \"957d72db-4cb4-4e97-bb11-2f25eb03f259\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579226 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-oauth-serving-cert\") pod \"957d72db-4cb4-4e97-bb11-2f25eb03f259\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " 
Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579282 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-config\") pod \"957d72db-4cb4-4e97-bb11-2f25eb03f259\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579359 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-oauth-config\") pod \"957d72db-4cb4-4e97-bb11-2f25eb03f259\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579386 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-serving-cert\") pod \"957d72db-4cb4-4e97-bb11-2f25eb03f259\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579410 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-service-ca\") pod \"957d72db-4cb4-4e97-bb11-2f25eb03f259\" (UID: \"957d72db-4cb4-4e97-bb11-2f25eb03f259\") " Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579878 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-config" (OuterVolumeSpecName: "console-config") pod "957d72db-4cb4-4e97-bb11-2f25eb03f259" (UID: "957d72db-4cb4-4e97-bb11-2f25eb03f259"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.579903 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "957d72db-4cb4-4e97-bb11-2f25eb03f259" (UID: "957d72db-4cb4-4e97-bb11-2f25eb03f259"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.580145 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-service-ca" (OuterVolumeSpecName: "service-ca") pod "957d72db-4cb4-4e97-bb11-2f25eb03f259" (UID: "957d72db-4cb4-4e97-bb11-2f25eb03f259"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.580165 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "957d72db-4cb4-4e97-bb11-2f25eb03f259" (UID: "957d72db-4cb4-4e97-bb11-2f25eb03f259"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.583983 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957d72db-4cb4-4e97-bb11-2f25eb03f259-kube-api-access-swh9v" (OuterVolumeSpecName: "kube-api-access-swh9v") pod "957d72db-4cb4-4e97-bb11-2f25eb03f259" (UID: "957d72db-4cb4-4e97-bb11-2f25eb03f259"). InnerVolumeSpecName "kube-api-access-swh9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.584651 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "957d72db-4cb4-4e97-bb11-2f25eb03f259" (UID: "957d72db-4cb4-4e97-bb11-2f25eb03f259"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.584777 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "957d72db-4cb4-4e97-bb11-2f25eb03f259" (UID: "957d72db-4cb4-4e97-bb11-2f25eb03f259"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.680630 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.680660 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.680670 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.680680 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swh9v\" (UniqueName: 
\"kubernetes.io/projected/957d72db-4cb4-4e97-bb11-2f25eb03f259-kube-api-access-swh9v\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.680690 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.680699 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.680706 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/957d72db-4cb4-4e97-bb11-2f25eb03f259-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.961512 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2fnwq_957d72db-4cb4-4e97-bb11-2f25eb03f259/console/0.log" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.961560 4907 generic.go:334] "Generic (PLEG): container finished" podID="957d72db-4cb4-4e97-bb11-2f25eb03f259" containerID="137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da" exitCode=2 Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.961626 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2fnwq" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.961635 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fnwq" event={"ID":"957d72db-4cb4-4e97-bb11-2f25eb03f259","Type":"ContainerDied","Data":"137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da"} Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.961664 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2fnwq" event={"ID":"957d72db-4cb4-4e97-bb11-2f25eb03f259","Type":"ContainerDied","Data":"e1e6183e65f573fa0745f6134a0b61cc0f5e982a080b33c037faebcac17adf4f"} Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.961704 4907 scope.go:117] "RemoveContainer" containerID="137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.963657 4907 generic.go:334] "Generic (PLEG): container finished" podID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerID="9ca98a52f00fff3d555b7daf7edc3bf7007e03cdd98f2e018e170bf802113411" exitCode=0 Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.963700 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" event={"ID":"36417bf7-a6b9-4677-baff-e04cd0e7f1dd","Type":"ContainerDied","Data":"9ca98a52f00fff3d555b7daf7edc3bf7007e03cdd98f2e018e170bf802113411"} Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.988987 4907 scope.go:117] "RemoveContainer" containerID="137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da" Oct 09 19:41:30 crc kubenswrapper[4907]: E1009 19:41:30.989459 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da\": container with ID starting with 
137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da not found: ID does not exist" containerID="137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da" Oct 09 19:41:30 crc kubenswrapper[4907]: I1009 19:41:30.989531 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da"} err="failed to get container status \"137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da\": rpc error: code = NotFound desc = could not find container \"137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da\": container with ID starting with 137efcd15da644497909372355916a12eea907743d99b3e1c96ac734efbeb5da not found: ID does not exist" Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.009027 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hxjr9"] Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.024309 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2fnwq"] Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.030505 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-2fnwq"] Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.160388 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957d72db-4cb4-4e97-bb11-2f25eb03f259" path="/var/lib/kubelet/pods/957d72db-4cb4-4e97-bb11-2f25eb03f259/volumes" Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.973622 4907 generic.go:334] "Generic (PLEG): container finished" podID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerID="a943ae34effc3d48ef1d4ac7efab91df08bd50da5c9fb9764758d9182d97814f" exitCode=0 Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.973682 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" 
event={"ID":"36417bf7-a6b9-4677-baff-e04cd0e7f1dd","Type":"ContainerDied","Data":"a943ae34effc3d48ef1d4ac7efab91df08bd50da5c9fb9764758d9182d97814f"} Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.976697 4907 generic.go:334] "Generic (PLEG): container finished" podID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerID="4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d" exitCode=0 Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.976746 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxjr9" event={"ID":"1dcf26be-966b-4d09-9bdd-5d2a52da8b05","Type":"ContainerDied","Data":"4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d"} Oct 09 19:41:31 crc kubenswrapper[4907]: I1009 19:41:31.976779 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxjr9" event={"ID":"1dcf26be-966b-4d09-9bdd-5d2a52da8b05","Type":"ContainerStarted","Data":"0e42a9d06e3fb4366db3edd1f551470868afaf1d552f160ffcd22de9a3982b61"} Oct 09 19:41:32 crc kubenswrapper[4907]: I1009 19:41:32.987655 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxjr9" event={"ID":"1dcf26be-966b-4d09-9bdd-5d2a52da8b05","Type":"ContainerStarted","Data":"c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40"} Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.396743 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.523180 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-util\") pod \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.523278 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-bundle\") pod \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.523436 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsxst\" (UniqueName: \"kubernetes.io/projected/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-kube-api-access-gsxst\") pod \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\" (UID: \"36417bf7-a6b9-4677-baff-e04cd0e7f1dd\") " Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.525337 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-bundle" (OuterVolumeSpecName: "bundle") pod "36417bf7-a6b9-4677-baff-e04cd0e7f1dd" (UID: "36417bf7-a6b9-4677-baff-e04cd0e7f1dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.536731 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-kube-api-access-gsxst" (OuterVolumeSpecName: "kube-api-access-gsxst") pod "36417bf7-a6b9-4677-baff-e04cd0e7f1dd" (UID: "36417bf7-a6b9-4677-baff-e04cd0e7f1dd"). InnerVolumeSpecName "kube-api-access-gsxst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.541327 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-util" (OuterVolumeSpecName: "util") pod "36417bf7-a6b9-4677-baff-e04cd0e7f1dd" (UID: "36417bf7-a6b9-4677-baff-e04cd0e7f1dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.624508 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsxst\" (UniqueName: \"kubernetes.io/projected/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-kube-api-access-gsxst\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.624559 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-util\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.624581 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36417bf7-a6b9-4677-baff-e04cd0e7f1dd-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.996388 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" event={"ID":"36417bf7-a6b9-4677-baff-e04cd0e7f1dd","Type":"ContainerDied","Data":"9b57907a5637606b15f03796ff60ba22a3852779cc5bd74ee1b64e2e6e816343"} Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.996454 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b57907a5637606b15f03796ff60ba22a3852779cc5bd74ee1b64e2e6e816343" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.996407 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv" Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.998625 4907 generic.go:334] "Generic (PLEG): container finished" podID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerID="c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40" exitCode=0 Oct 09 19:41:33 crc kubenswrapper[4907]: I1009 19:41:33.998717 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxjr9" event={"ID":"1dcf26be-966b-4d09-9bdd-5d2a52da8b05","Type":"ContainerDied","Data":"c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40"} Oct 09 19:41:35 crc kubenswrapper[4907]: I1009 19:41:35.006626 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxjr9" event={"ID":"1dcf26be-966b-4d09-9bdd-5d2a52da8b05","Type":"ContainerStarted","Data":"e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649"} Oct 09 19:41:35 crc kubenswrapper[4907]: I1009 19:41:35.034403 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hxjr9" podStartSLOduration=2.542827754 podStartE2EDuration="5.034379566s" podCreationTimestamp="2025-10-09 19:41:30 +0000 UTC" firstStartedPulling="2025-10-09 19:41:31.978152178 +0000 UTC m=+777.510119677" lastFinishedPulling="2025-10-09 19:41:34.46970399 +0000 UTC m=+780.001671489" observedRunningTime="2025-10-09 19:41:35.034005107 +0000 UTC m=+780.565972656" watchObservedRunningTime="2025-10-09 19:41:35.034379566 +0000 UTC m=+780.566347065" Oct 09 19:41:40 crc kubenswrapper[4907]: I1009 19:41:40.558860 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:40 crc kubenswrapper[4907]: I1009 19:41:40.559558 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:40 crc kubenswrapper[4907]: I1009 19:41:40.665806 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:41 crc kubenswrapper[4907]: I1009 19:41:41.089991 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:42 crc kubenswrapper[4907]: I1009 19:41:42.013019 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hxjr9"] Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.059082 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hxjr9" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="registry-server" containerID="cri-o://e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649" gracePeriod=2 Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.449945 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-55785946f8-t68kr"] Oct 09 19:41:43 crc kubenswrapper[4907]: E1009 19:41:43.450204 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerName="pull" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.450217 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerName="pull" Oct 09 19:41:43 crc kubenswrapper[4907]: E1009 19:41:43.450228 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerName="util" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.450236 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerName="util" Oct 09 19:41:43 crc kubenswrapper[4907]: E1009 19:41:43.450245 4907 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerName="extract" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.450252 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerName="extract" Oct 09 19:41:43 crc kubenswrapper[4907]: E1009 19:41:43.450271 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957d72db-4cb4-4e97-bb11-2f25eb03f259" containerName="console" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.450278 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="957d72db-4cb4-4e97-bb11-2f25eb03f259" containerName="console" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.450404 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="957d72db-4cb4-4e97-bb11-2f25eb03f259" containerName="console" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.450418 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="36417bf7-a6b9-4677-baff-e04cd0e7f1dd" containerName="extract" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.450998 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.457168 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.457457 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.457830 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.458005 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tbxwc" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.460927 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.465265 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55785946f8-t68kr"] Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.552114 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e6d0933-34d0-4cf2-bc08-75d11b13e618-webhook-cert\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.552169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e6d0933-34d0-4cf2-bc08-75d11b13e618-apiservice-cert\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: 
\"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.552255 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqs5f\" (UniqueName: \"kubernetes.io/projected/5e6d0933-34d0-4cf2-bc08-75d11b13e618-kube-api-access-mqs5f\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.653475 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e6d0933-34d0-4cf2-bc08-75d11b13e618-webhook-cert\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.653534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e6d0933-34d0-4cf2-bc08-75d11b13e618-apiservice-cert\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.653592 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqs5f\" (UniqueName: \"kubernetes.io/projected/5e6d0933-34d0-4cf2-bc08-75d11b13e618-kube-api-access-mqs5f\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.660283 4907 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.660766 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e6d0933-34d0-4cf2-bc08-75d11b13e618-apiservice-cert\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.661593 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e6d0933-34d0-4cf2-bc08-75d11b13e618-webhook-cert\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.669713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqs5f\" (UniqueName: \"kubernetes.io/projected/5e6d0933-34d0-4cf2-bc08-75d11b13e618-kube-api-access-mqs5f\") pod \"metallb-operator-controller-manager-55785946f8-t68kr\" (UID: \"5e6d0933-34d0-4cf2-bc08-75d11b13e618\") " pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.754949 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-catalog-content\") pod \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.755015 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-utilities\") pod \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.755054 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkk4x\" (UniqueName: \"kubernetes.io/projected/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-kube-api-access-xkk4x\") pod \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\" (UID: \"1dcf26be-966b-4d09-9bdd-5d2a52da8b05\") " Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.755795 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-utilities" (OuterVolumeSpecName: "utilities") pod "1dcf26be-966b-4d09-9bdd-5d2a52da8b05" (UID: "1dcf26be-966b-4d09-9bdd-5d2a52da8b05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.758322 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-kube-api-access-xkk4x" (OuterVolumeSpecName: "kube-api-access-xkk4x") pod "1dcf26be-966b-4d09-9bdd-5d2a52da8b05" (UID: "1dcf26be-966b-4d09-9bdd-5d2a52da8b05"). InnerVolumeSpecName "kube-api-access-xkk4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.792196 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q"] Oct 09 19:41:43 crc kubenswrapper[4907]: E1009 19:41:43.792450 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="registry-server" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.792479 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="registry-server" Oct 09 19:41:43 crc kubenswrapper[4907]: E1009 19:41:43.792500 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="extract-utilities" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.792508 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="extract-utilities" Oct 09 19:41:43 crc kubenswrapper[4907]: E1009 19:41:43.792530 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="extract-content" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.792538 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="extract-content" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.792655 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerName="registry-server" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.793145 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.794330 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.798891 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.798924 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.799130 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8fj9c" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.811048 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q"] Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.856312 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.856342 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkk4x\" (UniqueName: \"kubernetes.io/projected/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-kube-api-access-xkk4x\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.877743 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dcf26be-966b-4d09-9bdd-5d2a52da8b05" (UID: "1dcf26be-966b-4d09-9bdd-5d2a52da8b05"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.958010 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-webhook-cert\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.958064 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-kube-api-access-7sqqv\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.958085 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-apiservice-cert\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:43 crc kubenswrapper[4907]: I1009 19:41:43.958117 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcf26be-966b-4d09-9bdd-5d2a52da8b05-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.059436 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-kube-api-access-7sqqv\") pod 
\"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.059493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-apiservice-cert\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.059561 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-webhook-cert\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.063381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-webhook-cert\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.063855 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-apiservice-cert\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.065595 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" containerID="e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649" exitCode=0 Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.065646 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxjr9" event={"ID":"1dcf26be-966b-4d09-9bdd-5d2a52da8b05","Type":"ContainerDied","Data":"e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649"} Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.065672 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxjr9" event={"ID":"1dcf26be-966b-4d09-9bdd-5d2a52da8b05","Type":"ContainerDied","Data":"0e42a9d06e3fb4366db3edd1f551470868afaf1d552f160ffcd22de9a3982b61"} Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.065692 4907 scope.go:117] "RemoveContainer" containerID="e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.065845 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hxjr9" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.080076 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sqqv\" (UniqueName: \"kubernetes.io/projected/7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b-kube-api-access-7sqqv\") pod \"metallb-operator-webhook-server-5c6994cd9d-kw99q\" (UID: \"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b\") " pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.082110 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-55785946f8-t68kr"] Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.086629 4907 scope.go:117] "RemoveContainer" containerID="c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40" Oct 09 19:41:44 crc kubenswrapper[4907]: W1009 19:41:44.099895 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6d0933_34d0_4cf2_bc08_75d11b13e618.slice/crio-1ba59fb8f18c04b4fbb1ddd2a4db738364eb451894ffdbd95d1b355040b69cff WatchSource:0}: Error finding container 1ba59fb8f18c04b4fbb1ddd2a4db738364eb451894ffdbd95d1b355040b69cff: Status 404 returned error can't find the container with id 1ba59fb8f18c04b4fbb1ddd2a4db738364eb451894ffdbd95d1b355040b69cff Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.102301 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hxjr9"] Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.109085 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hxjr9"] Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.115547 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.120738 4907 scope.go:117] "RemoveContainer" containerID="4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.141136 4907 scope.go:117] "RemoveContainer" containerID="e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649" Oct 09 19:41:44 crc kubenswrapper[4907]: E1009 19:41:44.141453 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649\": container with ID starting with e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649 not found: ID does not exist" containerID="e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.141494 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649"} err="failed to get container status \"e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649\": rpc error: code = NotFound desc = could not find container \"e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649\": container with ID starting with e4ad2a096db42856c219f8af6faaee4bf3478962b66d9f18eff20fb204d81649 not found: ID does not exist" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.141515 4907 scope.go:117] "RemoveContainer" containerID="c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40" Oct 09 19:41:44 crc kubenswrapper[4907]: E1009 19:41:44.141842 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40\": container with ID starting with 
c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40 not found: ID does not exist" containerID="c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.141896 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40"} err="failed to get container status \"c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40\": rpc error: code = NotFound desc = could not find container \"c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40\": container with ID starting with c7ff03a431d339329381811fd16f3365cc2bb6f9a20b17d1fc9a1dff323bbb40 not found: ID does not exist" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.141914 4907 scope.go:117] "RemoveContainer" containerID="4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d" Oct 09 19:41:44 crc kubenswrapper[4907]: E1009 19:41:44.142123 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d\": container with ID starting with 4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d not found: ID does not exist" containerID="4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.142144 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d"} err="failed to get container status \"4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d\": rpc error: code = NotFound desc = could not find container \"4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d\": container with ID starting with 4b5d725847060344daf8c8fc0f449377b25beb71e8745fced07f3823f177251d not found: ID does not 
exist" Oct 09 19:41:44 crc kubenswrapper[4907]: I1009 19:41:44.584630 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q"] Oct 09 19:41:44 crc kubenswrapper[4907]: W1009 19:41:44.594203 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7820e44f_78bd_4549_8f4b_d7f2ec3b2b1b.slice/crio-e20e5f9a96309b8a035c524cb60f312a98de910ea67aeda9baaeffbe5ce5f9c6 WatchSource:0}: Error finding container e20e5f9a96309b8a035c524cb60f312a98de910ea67aeda9baaeffbe5ce5f9c6: Status 404 returned error can't find the container with id e20e5f9a96309b8a035c524cb60f312a98de910ea67aeda9baaeffbe5ce5f9c6 Oct 09 19:41:45 crc kubenswrapper[4907]: I1009 19:41:45.071572 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" event={"ID":"5e6d0933-34d0-4cf2-bc08-75d11b13e618","Type":"ContainerStarted","Data":"1ba59fb8f18c04b4fbb1ddd2a4db738364eb451894ffdbd95d1b355040b69cff"} Oct 09 19:41:45 crc kubenswrapper[4907]: I1009 19:41:45.073720 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" event={"ID":"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b","Type":"ContainerStarted","Data":"e20e5f9a96309b8a035c524cb60f312a98de910ea67aeda9baaeffbe5ce5f9c6"} Oct 09 19:41:45 crc kubenswrapper[4907]: I1009 19:41:45.159285 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcf26be-966b-4d09-9bdd-5d2a52da8b05" path="/var/lib/kubelet/pods/1dcf26be-966b-4d09-9bdd-5d2a52da8b05/volumes" Oct 09 19:41:52 crc kubenswrapper[4907]: I1009 19:41:52.134834 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" 
event={"ID":"7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b","Type":"ContainerStarted","Data":"533bc2c671e1f8f70a8770c482e5a3bc99411c30a54500ad95a7f7fa083bf3ff"} Oct 09 19:41:52 crc kubenswrapper[4907]: I1009 19:41:52.135394 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:41:52 crc kubenswrapper[4907]: I1009 19:41:52.136899 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" event={"ID":"5e6d0933-34d0-4cf2-bc08-75d11b13e618","Type":"ContainerStarted","Data":"21d9405b519d92d70cf40b0b6ab6b20f2d9966b02272af7538519310478d8363"} Oct 09 19:41:52 crc kubenswrapper[4907]: I1009 19:41:52.137060 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:41:52 crc kubenswrapper[4907]: I1009 19:41:52.159830 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" podStartSLOduration=2.799678856 podStartE2EDuration="9.159812068s" podCreationTimestamp="2025-10-09 19:41:43 +0000 UTC" firstStartedPulling="2025-10-09 19:41:44.598613794 +0000 UTC m=+790.130581283" lastFinishedPulling="2025-10-09 19:41:50.958747006 +0000 UTC m=+796.490714495" observedRunningTime="2025-10-09 19:41:52.156897533 +0000 UTC m=+797.688865032" watchObservedRunningTime="2025-10-09 19:41:52.159812068 +0000 UTC m=+797.691779557" Oct 09 19:41:52 crc kubenswrapper[4907]: I1009 19:41:52.187334 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" podStartSLOduration=2.359052146 podStartE2EDuration="9.187305907s" podCreationTimestamp="2025-10-09 19:41:43 +0000 UTC" firstStartedPulling="2025-10-09 19:41:44.103194001 +0000 UTC m=+789.635161500" lastFinishedPulling="2025-10-09 
19:41:50.931447762 +0000 UTC m=+796.463415261" observedRunningTime="2025-10-09 19:41:52.181391504 +0000 UTC m=+797.713359033" watchObservedRunningTime="2025-10-09 19:41:52.187305907 +0000 UTC m=+797.719273416" Oct 09 19:41:58 crc kubenswrapper[4907]: I1009 19:41:58.988300 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-98hhd"] Oct 09 19:41:58 crc kubenswrapper[4907]: I1009 19:41:58.990657 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.000985 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98hhd"] Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.063106 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-utilities\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.063152 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-catalog-content\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.063186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ssbm\" (UniqueName: \"kubernetes.io/projected/ace82cb6-9d0d-4cea-9dce-4b5650de749f-kube-api-access-7ssbm\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 
09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.164704 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-utilities\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.164971 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-catalog-content\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.165009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ssbm\" (UniqueName: \"kubernetes.io/projected/ace82cb6-9d0d-4cea-9dce-4b5650de749f-kube-api-access-7ssbm\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.165291 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-utilities\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.165443 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-catalog-content\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 
19:41:59.185159 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ssbm\" (UniqueName: \"kubernetes.io/projected/ace82cb6-9d0d-4cea-9dce-4b5650de749f-kube-api-access-7ssbm\") pod \"community-operators-98hhd\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.306643 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:41:59 crc kubenswrapper[4907]: I1009 19:41:59.794704 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98hhd"] Oct 09 19:42:00 crc kubenswrapper[4907]: I1009 19:42:00.187929 4907 generic.go:334] "Generic (PLEG): container finished" podID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerID="120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c" exitCode=0 Oct 09 19:42:00 crc kubenswrapper[4907]: I1009 19:42:00.187971 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98hhd" event={"ID":"ace82cb6-9d0d-4cea-9dce-4b5650de749f","Type":"ContainerDied","Data":"120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c"} Oct 09 19:42:00 crc kubenswrapper[4907]: I1009 19:42:00.187994 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98hhd" event={"ID":"ace82cb6-9d0d-4cea-9dce-4b5650de749f","Type":"ContainerStarted","Data":"ab493b1ebfa16ad3a68e3adbbe6278b3fb25e575a9b1de7f4870a67279b8bf38"} Oct 09 19:42:02 crc kubenswrapper[4907]: I1009 19:42:02.205592 4907 generic.go:334] "Generic (PLEG): container finished" podID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerID="174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8" exitCode=0 Oct 09 19:42:02 crc kubenswrapper[4907]: I1009 19:42:02.205702 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-98hhd" event={"ID":"ace82cb6-9d0d-4cea-9dce-4b5650de749f","Type":"ContainerDied","Data":"174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8"} Oct 09 19:42:03 crc kubenswrapper[4907]: I1009 19:42:03.214443 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98hhd" event={"ID":"ace82cb6-9d0d-4cea-9dce-4b5650de749f","Type":"ContainerStarted","Data":"99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e"} Oct 09 19:42:03 crc kubenswrapper[4907]: I1009 19:42:03.230386 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-98hhd" podStartSLOduration=2.668214694 podStartE2EDuration="5.230367962s" podCreationTimestamp="2025-10-09 19:41:58 +0000 UTC" firstStartedPulling="2025-10-09 19:42:00.189711066 +0000 UTC m=+805.721678565" lastFinishedPulling="2025-10-09 19:42:02.751864344 +0000 UTC m=+808.283831833" observedRunningTime="2025-10-09 19:42:03.228865433 +0000 UTC m=+808.760832952" watchObservedRunningTime="2025-10-09 19:42:03.230367962 +0000 UTC m=+808.762335451" Oct 09 19:42:04 crc kubenswrapper[4907]: I1009 19:42:04.120241 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5c6994cd9d-kw99q" Oct 09 19:42:09 crc kubenswrapper[4907]: I1009 19:42:09.307121 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:42:09 crc kubenswrapper[4907]: I1009 19:42:09.307542 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:42:09 crc kubenswrapper[4907]: I1009 19:42:09.369749 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:42:10 crc kubenswrapper[4907]: I1009 19:42:10.304753 
4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:42:10 crc kubenswrapper[4907]: I1009 19:42:10.373343 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98hhd"] Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.015000 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-64wrb"] Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.017658 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.047539 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-64wrb"] Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.139120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-catalog-content\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.139173 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hr4\" (UniqueName: \"kubernetes.io/projected/1944eafa-cdc2-474b-9ef1-7404557348be-kube-api-access-z6hr4\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.139194 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-utilities\") pod \"certified-operators-64wrb\" (UID: 
\"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.240400 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-catalog-content\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.240455 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hr4\" (UniqueName: \"kubernetes.io/projected/1944eafa-cdc2-474b-9ef1-7404557348be-kube-api-access-z6hr4\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.240508 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-utilities\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.240942 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-catalog-content\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.241006 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-utilities\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") 
" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.264763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hr4\" (UniqueName: \"kubernetes.io/projected/1944eafa-cdc2-474b-9ef1-7404557348be-kube-api-access-z6hr4\") pod \"certified-operators-64wrb\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.272752 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-98hhd" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerName="registry-server" containerID="cri-o://99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e" gracePeriod=2 Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.352002 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.683362 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.748396 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-catalog-content\") pod \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.748544 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ssbm\" (UniqueName: \"kubernetes.io/projected/ace82cb6-9d0d-4cea-9dce-4b5650de749f-kube-api-access-7ssbm\") pod \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.748607 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-utilities\") pod \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\" (UID: \"ace82cb6-9d0d-4cea-9dce-4b5650de749f\") " Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.749545 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-utilities" (OuterVolumeSpecName: "utilities") pod "ace82cb6-9d0d-4cea-9dce-4b5650de749f" (UID: "ace82cb6-9d0d-4cea-9dce-4b5650de749f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.754546 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace82cb6-9d0d-4cea-9dce-4b5650de749f-kube-api-access-7ssbm" (OuterVolumeSpecName: "kube-api-access-7ssbm") pod "ace82cb6-9d0d-4cea-9dce-4b5650de749f" (UID: "ace82cb6-9d0d-4cea-9dce-4b5650de749f"). InnerVolumeSpecName "kube-api-access-7ssbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.811048 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ace82cb6-9d0d-4cea-9dce-4b5650de749f" (UID: "ace82cb6-9d0d-4cea-9dce-4b5650de749f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.843817 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-64wrb"] Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.850334 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ssbm\" (UniqueName: \"kubernetes.io/projected/ace82cb6-9d0d-4cea-9dce-4b5650de749f-kube-api-access-7ssbm\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.850372 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:12 crc kubenswrapper[4907]: I1009 19:42:12.850381 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace82cb6-9d0d-4cea-9dce-4b5650de749f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.283282 4907 generic.go:334] "Generic (PLEG): container finished" podID="1944eafa-cdc2-474b-9ef1-7404557348be" containerID="ea58ecd7d09eac27fb4aaf86a18ded1535db6ae2e80f6432de993f70f034a4a1" exitCode=0 Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.283355 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64wrb" 
event={"ID":"1944eafa-cdc2-474b-9ef1-7404557348be","Type":"ContainerDied","Data":"ea58ecd7d09eac27fb4aaf86a18ded1535db6ae2e80f6432de993f70f034a4a1"} Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.283384 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64wrb" event={"ID":"1944eafa-cdc2-474b-9ef1-7404557348be","Type":"ContainerStarted","Data":"7346ac0806a0545ab365e16d25af284b0eb2ef5ee1bbc94406761cca3965d08a"} Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.289714 4907 generic.go:334] "Generic (PLEG): container finished" podID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerID="99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e" exitCode=0 Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.289754 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98hhd" event={"ID":"ace82cb6-9d0d-4cea-9dce-4b5650de749f","Type":"ContainerDied","Data":"99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e"} Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.289785 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98hhd" event={"ID":"ace82cb6-9d0d-4cea-9dce-4b5650de749f","Type":"ContainerDied","Data":"ab493b1ebfa16ad3a68e3adbbe6278b3fb25e575a9b1de7f4870a67279b8bf38"} Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.289812 4907 scope.go:117] "RemoveContainer" containerID="99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.289811 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98hhd" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.319067 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98hhd"] Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.324779 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-98hhd"] Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.325240 4907 scope.go:117] "RemoveContainer" containerID="174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.345115 4907 scope.go:117] "RemoveContainer" containerID="120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.371784 4907 scope.go:117] "RemoveContainer" containerID="99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e" Oct 09 19:42:13 crc kubenswrapper[4907]: E1009 19:42:13.372394 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e\": container with ID starting with 99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e not found: ID does not exist" containerID="99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.372494 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e"} err="failed to get container status \"99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e\": rpc error: code = NotFound desc = could not find container \"99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e\": container with ID starting with 99ba6d629a59bb68d0ecf42113b15b03e82e9490622079fa67315f81cd8ade4e not 
found: ID does not exist" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.372526 4907 scope.go:117] "RemoveContainer" containerID="174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8" Oct 09 19:42:13 crc kubenswrapper[4907]: E1009 19:42:13.373077 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8\": container with ID starting with 174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8 not found: ID does not exist" containerID="174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.373118 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8"} err="failed to get container status \"174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8\": rpc error: code = NotFound desc = could not find container \"174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8\": container with ID starting with 174931beb6e714a26980e68e15c0912140cbdb126f67480367971a3694a011f8 not found: ID does not exist" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.373144 4907 scope.go:117] "RemoveContainer" containerID="120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c" Oct 09 19:42:13 crc kubenswrapper[4907]: E1009 19:42:13.373790 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c\": container with ID starting with 120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c not found: ID does not exist" containerID="120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c" Oct 09 19:42:13 crc kubenswrapper[4907]: I1009 19:42:13.373819 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c"} err="failed to get container status \"120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c\": rpc error: code = NotFound desc = could not find container \"120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c\": container with ID starting with 120ea389bb6a90538e83d1a9b9bca37e3fc9e9d4810cdb5054b9f7c58c7f030c not found: ID does not exist" Oct 09 19:42:14 crc kubenswrapper[4907]: I1009 19:42:14.301851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64wrb" event={"ID":"1944eafa-cdc2-474b-9ef1-7404557348be","Type":"ContainerStarted","Data":"9357c60cc431441354c05c8f477909052b8cd94818cd88914a45749e08d2bebd"} Oct 09 19:42:15 crc kubenswrapper[4907]: I1009 19:42:15.159491 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" path="/var/lib/kubelet/pods/ace82cb6-9d0d-4cea-9dce-4b5650de749f/volumes" Oct 09 19:42:15 crc kubenswrapper[4907]: I1009 19:42:15.311643 4907 generic.go:334] "Generic (PLEG): container finished" podID="1944eafa-cdc2-474b-9ef1-7404557348be" containerID="9357c60cc431441354c05c8f477909052b8cd94818cd88914a45749e08d2bebd" exitCode=0 Oct 09 19:42:15 crc kubenswrapper[4907]: I1009 19:42:15.311686 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64wrb" event={"ID":"1944eafa-cdc2-474b-9ef1-7404557348be","Type":"ContainerDied","Data":"9357c60cc431441354c05c8f477909052b8cd94818cd88914a45749e08d2bebd"} Oct 09 19:42:16 crc kubenswrapper[4907]: I1009 19:42:16.321348 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64wrb" event={"ID":"1944eafa-cdc2-474b-9ef1-7404557348be","Type":"ContainerStarted","Data":"53c41f1e44643dcf119eef29ca1b65df740e7b1d08993da290a48564a285c221"} 
Oct 09 19:42:16 crc kubenswrapper[4907]: I1009 19:42:16.346911 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-64wrb" podStartSLOduration=2.8567557839999997 podStartE2EDuration="5.346884955s" podCreationTimestamp="2025-10-09 19:42:11 +0000 UTC" firstStartedPulling="2025-10-09 19:42:13.285831655 +0000 UTC m=+818.817799154" lastFinishedPulling="2025-10-09 19:42:15.775960836 +0000 UTC m=+821.307928325" observedRunningTime="2025-10-09 19:42:16.344235357 +0000 UTC m=+821.876202866" watchObservedRunningTime="2025-10-09 19:42:16.346884955 +0000 UTC m=+821.878852484" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.227324 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d585w"] Oct 09 19:42:19 crc kubenswrapper[4907]: E1009 19:42:19.229984 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerName="registry-server" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.230199 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerName="registry-server" Oct 09 19:42:19 crc kubenswrapper[4907]: E1009 19:42:19.230384 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerName="extract-content" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.230592 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerName="extract-content" Oct 09 19:42:19 crc kubenswrapper[4907]: E1009 19:42:19.230778 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerName="extract-utilities" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.230937 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" 
containerName="extract-utilities" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.231353 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace82cb6-9d0d-4cea-9dce-4b5650de749f" containerName="registry-server" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.233542 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.254389 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d585w"] Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.357012 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh7hz\" (UniqueName: \"kubernetes.io/projected/26811e3a-bf9d-4ebf-bef8-650d566f6c90-kube-api-access-lh7hz\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.357086 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-utilities\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.357289 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-catalog-content\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.459514 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lh7hz\" (UniqueName: \"kubernetes.io/projected/26811e3a-bf9d-4ebf-bef8-650d566f6c90-kube-api-access-lh7hz\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.459585 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-utilities\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.459626 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-catalog-content\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.460137 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-catalog-content\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.460773 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-utilities\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.480024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh7hz\" (UniqueName: 
\"kubernetes.io/projected/26811e3a-bf9d-4ebf-bef8-650d566f6c90-kube-api-access-lh7hz\") pod \"redhat-marketplace-d585w\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.558084 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:19 crc kubenswrapper[4907]: I1009 19:42:19.974688 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d585w"] Oct 09 19:42:19 crc kubenswrapper[4907]: W1009 19:42:19.981350 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26811e3a_bf9d_4ebf_bef8_650d566f6c90.slice/crio-c1bc38c450af71ad75d4ba14051bf4ee76b96ddcdfe5c1267ea35af05dc17338 WatchSource:0}: Error finding container c1bc38c450af71ad75d4ba14051bf4ee76b96ddcdfe5c1267ea35af05dc17338: Status 404 returned error can't find the container with id c1bc38c450af71ad75d4ba14051bf4ee76b96ddcdfe5c1267ea35af05dc17338 Oct 09 19:42:20 crc kubenswrapper[4907]: I1009 19:42:20.350008 4907 generic.go:334] "Generic (PLEG): container finished" podID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerID="87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796" exitCode=0 Oct 09 19:42:20 crc kubenswrapper[4907]: I1009 19:42:20.350126 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d585w" event={"ID":"26811e3a-bf9d-4ebf-bef8-650d566f6c90","Type":"ContainerDied","Data":"87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796"} Oct 09 19:42:20 crc kubenswrapper[4907]: I1009 19:42:20.350588 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d585w" 
event={"ID":"26811e3a-bf9d-4ebf-bef8-650d566f6c90","Type":"ContainerStarted","Data":"c1bc38c450af71ad75d4ba14051bf4ee76b96ddcdfe5c1267ea35af05dc17338"} Oct 09 19:42:22 crc kubenswrapper[4907]: I1009 19:42:22.352493 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:22 crc kubenswrapper[4907]: I1009 19:42:22.352927 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:22 crc kubenswrapper[4907]: I1009 19:42:22.364879 4907 generic.go:334] "Generic (PLEG): container finished" podID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerID="9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961" exitCode=0 Oct 09 19:42:22 crc kubenswrapper[4907]: I1009 19:42:22.365321 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d585w" event={"ID":"26811e3a-bf9d-4ebf-bef8-650d566f6c90","Type":"ContainerDied","Data":"9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961"} Oct 09 19:42:22 crc kubenswrapper[4907]: I1009 19:42:22.416097 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:23 crc kubenswrapper[4907]: I1009 19:42:23.375642 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d585w" event={"ID":"26811e3a-bf9d-4ebf-bef8-650d566f6c90","Type":"ContainerStarted","Data":"d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4"} Oct 09 19:42:23 crc kubenswrapper[4907]: I1009 19:42:23.406748 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d585w" podStartSLOduration=1.714521596 podStartE2EDuration="4.406725513s" podCreationTimestamp="2025-10-09 19:42:19 +0000 UTC" firstStartedPulling="2025-10-09 19:42:20.352758186 +0000 UTC 
m=+825.884725715" lastFinishedPulling="2025-10-09 19:42:23.044962103 +0000 UTC m=+828.576929632" observedRunningTime="2025-10-09 19:42:23.406389234 +0000 UTC m=+828.938356813" watchObservedRunningTime="2025-10-09 19:42:23.406725513 +0000 UTC m=+828.938693012" Oct 09 19:42:23 crc kubenswrapper[4907]: I1009 19:42:23.443699 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:23 crc kubenswrapper[4907]: I1009 19:42:23.797795 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-55785946f8-t68kr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.541755 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr"] Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.542484 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.544263 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-rmjt4" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.546335 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-km469"] Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.547334 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.548628 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.553368 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.553592 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.576640 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr"] Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.618134 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-m6wfh"] Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.619061 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.620947 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vnqgn" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.621114 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.621280 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.621432 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.632744 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-conf\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 
19:42:24.632786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-reloader\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.632809 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdk7f\" (UniqueName: \"kubernetes.io/projected/98b09a68-cabf-431c-9885-8f6e36c84de6-kube-api-access-vdk7f\") pod \"frr-k8s-webhook-server-64bf5d555-2z2vr\" (UID: \"98b09a68-cabf-431c-9885-8f6e36c84de6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.632851 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj8sb\" (UniqueName: \"kubernetes.io/projected/ba53143e-9c68-4c9f-89b8-45429be8e899-kube-api-access-gj8sb\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.632875 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.633015 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-startup\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.633113 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-sockets\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.633171 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics-certs\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.633241 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b09a68-cabf-431c-9885-8f6e36c84de6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-2z2vr\" (UID: \"98b09a68-cabf-431c-9885-8f6e36c84de6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.654643 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-6sl6d"] Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.655518 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.657697 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.662861 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-6sl6d"] Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734674 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42cf9557-7cae-41c0-bbaa-a3baa099e36c-cert\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj8sb\" (UniqueName: \"kubernetes.io/projected/ba53143e-9c68-4c9f-89b8-45429be8e899-kube-api-access-gj8sb\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734761 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tjv8\" (UniqueName: \"kubernetes.io/projected/42cf9557-7cae-41c0-bbaa-a3baa099e36c-kube-api-access-7tjv8\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734785 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: 
I1009 19:42:24.734803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-metallb-excludel2\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734826 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-startup\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734844 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42cf9557-7cae-41c0-bbaa-a3baa099e36c-metrics-certs\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734864 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-sockets\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734879 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics-certs\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/98b09a68-cabf-431c-9885-8f6e36c84de6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-2z2vr\" (UID: \"98b09a68-cabf-431c-9885-8f6e36c84de6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734924 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734949 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-conf\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-reloader\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.734985 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdk7f\" (UniqueName: \"kubernetes.io/projected/98b09a68-cabf-431c-9885-8f6e36c84de6-kube-api-access-vdk7f\") pod \"frr-k8s-webhook-server-64bf5d555-2z2vr\" (UID: \"98b09a68-cabf-431c-9885-8f6e36c84de6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.735003 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-metrics-certs\") 
pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.735029 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54qc\" (UniqueName: \"kubernetes.io/projected/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-kube-api-access-b54qc\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.735648 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.736376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-startup\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.736606 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-sockets\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: E1009 19:42:24.736666 4907 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 09 19:42:24 crc kubenswrapper[4907]: E1009 19:42:24.736705 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics-certs podName:ba53143e-9c68-4c9f-89b8-45429be8e899 nodeName:}" failed. 
No retries permitted until 2025-10-09 19:42:25.236693676 +0000 UTC m=+830.768661165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics-certs") pod "frr-k8s-km469" (UID: "ba53143e-9c68-4c9f-89b8-45429be8e899") : secret "frr-k8s-certs-secret" not found Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.737741 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-reloader\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.737965 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ba53143e-9c68-4c9f-89b8-45429be8e899-frr-conf\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.742888 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b09a68-cabf-431c-9885-8f6e36c84de6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-2z2vr\" (UID: \"98b09a68-cabf-431c-9885-8f6e36c84de6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.752250 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj8sb\" (UniqueName: \"kubernetes.io/projected/ba53143e-9c68-4c9f-89b8-45429be8e899-kube-api-access-gj8sb\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.756130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdk7f\" (UniqueName: 
\"kubernetes.io/projected/98b09a68-cabf-431c-9885-8f6e36c84de6-kube-api-access-vdk7f\") pod \"frr-k8s-webhook-server-64bf5d555-2z2vr\" (UID: \"98b09a68-cabf-431c-9885-8f6e36c84de6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.797274 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-64wrb"] Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.836305 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42cf9557-7cae-41c0-bbaa-a3baa099e36c-metrics-certs\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.836385 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.836430 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-metrics-certs\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.836485 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b54qc\" (UniqueName: \"kubernetes.io/projected/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-kube-api-access-b54qc\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.836509 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42cf9557-7cae-41c0-bbaa-a3baa099e36c-cert\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: E1009 19:42:24.836524 4907 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.836563 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tjv8\" (UniqueName: \"kubernetes.io/projected/42cf9557-7cae-41c0-bbaa-a3baa099e36c-kube-api-access-7tjv8\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.836591 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-metallb-excludel2\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: E1009 19:42:24.836608 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist podName:db2ff9e3-9b97-4ec9-8b10-c782c7784b8f nodeName:}" failed. No retries permitted until 2025-10-09 19:42:25.33659165 +0000 UTC m=+830.868559139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist") pod "speaker-m6wfh" (UID: "db2ff9e3-9b97-4ec9-8b10-c782c7784b8f") : secret "metallb-memberlist" not found Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.837349 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-metallb-excludel2\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.839904 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42cf9557-7cae-41c0-bbaa-a3baa099e36c-metrics-certs\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.839938 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42cf9557-7cae-41c0-bbaa-a3baa099e36c-cert\") pod \"controller-68d546b9d8-6sl6d\" (UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.841013 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-metrics-certs\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.854307 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tjv8\" (UniqueName: \"kubernetes.io/projected/42cf9557-7cae-41c0-bbaa-a3baa099e36c-kube-api-access-7tjv8\") pod \"controller-68d546b9d8-6sl6d\" 
(UID: \"42cf9557-7cae-41c0-bbaa-a3baa099e36c\") " pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.855616 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b54qc\" (UniqueName: \"kubernetes.io/projected/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-kube-api-access-b54qc\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.867690 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:24 crc kubenswrapper[4907]: I1009 19:42:24.969095 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.241906 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics-certs\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.251120 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba53143e-9c68-4c9f-89b8-45429be8e899-metrics-certs\") pod \"frr-k8s-km469\" (UID: \"ba53143e-9c68-4c9f-89b8-45429be8e899\") " pod="metallb-system/frr-k8s-km469" Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.276632 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr"] Oct 09 19:42:25 crc kubenswrapper[4907]: W1009 19:42:25.282983 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98b09a68_cabf_431c_9885_8f6e36c84de6.slice/crio-0b239b98f5f3f69aab35b96c5310d0bcae66a9a6202952f768523b6b5cea6195 WatchSource:0}: Error finding container 0b239b98f5f3f69aab35b96c5310d0bcae66a9a6202952f768523b6b5cea6195: Status 404 returned error can't find the container with id 0b239b98f5f3f69aab35b96c5310d0bcae66a9a6202952f768523b6b5cea6195 Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.348322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:25 crc kubenswrapper[4907]: E1009 19:42:25.348505 4907 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 19:42:25 crc kubenswrapper[4907]: E1009 19:42:25.348816 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist podName:db2ff9e3-9b97-4ec9-8b10-c782c7784b8f nodeName:}" failed. No retries permitted until 2025-10-09 19:42:26.348780934 +0000 UTC m=+831.880748423 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist") pod "speaker-m6wfh" (UID: "db2ff9e3-9b97-4ec9-8b10-c782c7784b8f") : secret "metallb-memberlist" not found Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.392671 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-6sl6d"] Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.395950 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" event={"ID":"98b09a68-cabf-431c-9885-8f6e36c84de6","Type":"ContainerStarted","Data":"0b239b98f5f3f69aab35b96c5310d0bcae66a9a6202952f768523b6b5cea6195"} Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.396168 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-64wrb" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="registry-server" containerID="cri-o://53c41f1e44643dcf119eef29ca1b65df740e7b1d08993da290a48564a285c221" gracePeriod=2 Oct 09 19:42:25 crc kubenswrapper[4907]: W1009 19:42:25.397091 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42cf9557_7cae_41c0_bbaa_a3baa099e36c.slice/crio-48b6ea9bffdf45c6e21559e6b85a90cb91da27b1d4cc11104f5e6cb71b3bde8c WatchSource:0}: Error finding container 48b6ea9bffdf45c6e21559e6b85a90cb91da27b1d4cc11104f5e6cb71b3bde8c: Status 404 returned error can't find the container with id 48b6ea9bffdf45c6e21559e6b85a90cb91da27b1d4cc11104f5e6cb71b3bde8c Oct 09 19:42:25 crc kubenswrapper[4907]: I1009 19:42:25.474987 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-km469" Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.362511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.369026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/db2ff9e3-9b97-4ec9-8b10-c782c7784b8f-memberlist\") pod \"speaker-m6wfh\" (UID: \"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f\") " pod="metallb-system/speaker-m6wfh" Oct 09 19:42:26 crc kubenswrapper[4907]: E1009 19:42:26.376899 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1944eafa_cdc2_474b_9ef1_7404557348be.slice/crio-conmon-53c41f1e44643dcf119eef29ca1b65df740e7b1d08993da290a48564a285c221.scope\": RecentStats: unable to find data in memory cache]" Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.406562 4907 generic.go:334] "Generic (PLEG): container finished" podID="1944eafa-cdc2-474b-9ef1-7404557348be" containerID="53c41f1e44643dcf119eef29ca1b65df740e7b1d08993da290a48564a285c221" exitCode=0 Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.406658 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64wrb" event={"ID":"1944eafa-cdc2-474b-9ef1-7404557348be","Type":"ContainerDied","Data":"53c41f1e44643dcf119eef29ca1b65df740e7b1d08993da290a48564a285c221"} Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.407940 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6sl6d" 
event={"ID":"42cf9557-7cae-41c0-bbaa-a3baa099e36c","Type":"ContainerStarted","Data":"dec9cf43365346f7d0e32aa752ec4418b5d661c54b580b3978ce2f2a1714544e"} Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.407968 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6sl6d" event={"ID":"42cf9557-7cae-41c0-bbaa-a3baa099e36c","Type":"ContainerStarted","Data":"23ddd78c55697ba74cdfa6fb56daa810c70210e43b679a412f4fda05f5d0f0a3"} Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.407978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-6sl6d" event={"ID":"42cf9557-7cae-41c0-bbaa-a3baa099e36c","Type":"ContainerStarted","Data":"48b6ea9bffdf45c6e21559e6b85a90cb91da27b1d4cc11104f5e6cb71b3bde8c"} Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.408788 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.411114 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerStarted","Data":"d4dacfc612843407d206269552f8b74963d8e8084d9d9e9208a46c301a48026b"} Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.433249 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-m6wfh" Oct 09 19:42:26 crc kubenswrapper[4907]: I1009 19:42:26.433328 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-6sl6d" podStartSLOduration=2.433310775 podStartE2EDuration="2.433310775s" podCreationTimestamp="2025-10-09 19:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:42:26.428714516 +0000 UTC m=+831.960682015" watchObservedRunningTime="2025-10-09 19:42:26.433310775 +0000 UTC m=+831.965278264" Oct 09 19:42:26 crc kubenswrapper[4907]: W1009 19:42:26.452516 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb2ff9e3_9b97_4ec9_8b10_c782c7784b8f.slice/crio-4694e6102aa12194b4bd6ba4041ad3b2e8963c05d81f7bbe26efa1fdabcc29f4 WatchSource:0}: Error finding container 4694e6102aa12194b4bd6ba4041ad3b2e8963c05d81f7bbe26efa1fdabcc29f4: Status 404 returned error can't find the container with id 4694e6102aa12194b4bd6ba4041ad3b2e8963c05d81f7bbe26efa1fdabcc29f4 Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.114351 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.172939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-utilities\") pod \"1944eafa-cdc2-474b-9ef1-7404557348be\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.173037 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-catalog-content\") pod \"1944eafa-cdc2-474b-9ef1-7404557348be\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.173058 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6hr4\" (UniqueName: \"kubernetes.io/projected/1944eafa-cdc2-474b-9ef1-7404557348be-kube-api-access-z6hr4\") pod \"1944eafa-cdc2-474b-9ef1-7404557348be\" (UID: \"1944eafa-cdc2-474b-9ef1-7404557348be\") " Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.173998 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-utilities" (OuterVolumeSpecName: "utilities") pod "1944eafa-cdc2-474b-9ef1-7404557348be" (UID: "1944eafa-cdc2-474b-9ef1-7404557348be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.191274 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1944eafa-cdc2-474b-9ef1-7404557348be-kube-api-access-z6hr4" (OuterVolumeSpecName: "kube-api-access-z6hr4") pod "1944eafa-cdc2-474b-9ef1-7404557348be" (UID: "1944eafa-cdc2-474b-9ef1-7404557348be"). InnerVolumeSpecName "kube-api-access-z6hr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.258757 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1944eafa-cdc2-474b-9ef1-7404557348be" (UID: "1944eafa-cdc2-474b-9ef1-7404557348be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.274015 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.274053 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1944eafa-cdc2-474b-9ef1-7404557348be-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.274065 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6hr4\" (UniqueName: \"kubernetes.io/projected/1944eafa-cdc2-474b-9ef1-7404557348be-kube-api-access-z6hr4\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.419968 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64wrb" event={"ID":"1944eafa-cdc2-474b-9ef1-7404557348be","Type":"ContainerDied","Data":"7346ac0806a0545ab365e16d25af284b0eb2ef5ee1bbc94406761cca3965d08a"} Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.420020 4907 scope.go:117] "RemoveContainer" containerID="53c41f1e44643dcf119eef29ca1b65df740e7b1d08993da290a48564a285c221" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.420143 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-64wrb" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.431631 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m6wfh" event={"ID":"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f","Type":"ContainerStarted","Data":"4f64035cdfef31bd38b78963927ceec9dfe0c724b4955677c48e4c3010446f7a"} Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.431667 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m6wfh" event={"ID":"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f","Type":"ContainerStarted","Data":"3357c16c6f1ba46f692524fc7ba0009762349db86f111ed2aebef299d93ca7e2"} Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.431678 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m6wfh" event={"ID":"db2ff9e3-9b97-4ec9-8b10-c782c7784b8f","Type":"ContainerStarted","Data":"4694e6102aa12194b4bd6ba4041ad3b2e8963c05d81f7bbe26efa1fdabcc29f4"} Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.432169 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-m6wfh" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.448755 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-m6wfh" podStartSLOduration=3.4487383449999998 podStartE2EDuration="3.448738345s" podCreationTimestamp="2025-10-09 19:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:42:27.447084092 +0000 UTC m=+832.979051581" watchObservedRunningTime="2025-10-09 19:42:27.448738345 +0000 UTC m=+832.980705824" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.462281 4907 scope.go:117] "RemoveContainer" containerID="9357c60cc431441354c05c8f477909052b8cd94818cd88914a45749e08d2bebd" Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.484616 4907 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-64wrb"] Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.492960 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-64wrb"] Oct 09 19:42:27 crc kubenswrapper[4907]: I1009 19:42:27.496334 4907 scope.go:117] "RemoveContainer" containerID="ea58ecd7d09eac27fb4aaf86a18ded1535db6ae2e80f6432de993f70f034a4a1" Oct 09 19:42:29 crc kubenswrapper[4907]: I1009 19:42:29.161824 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" path="/var/lib/kubelet/pods/1944eafa-cdc2-474b-9ef1-7404557348be/volumes" Oct 09 19:42:29 crc kubenswrapper[4907]: I1009 19:42:29.558851 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:29 crc kubenswrapper[4907]: I1009 19:42:29.558922 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:29 crc kubenswrapper[4907]: I1009 19:42:29.616957 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:30 crc kubenswrapper[4907]: I1009 19:42:30.526669 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:30 crc kubenswrapper[4907]: I1009 19:42:30.799849 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d585w"] Oct 09 19:42:32 crc kubenswrapper[4907]: I1009 19:42:32.481981 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d585w" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="registry-server" containerID="cri-o://d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4" gracePeriod=2 Oct 09 19:42:32 crc 
kubenswrapper[4907]: I1009 19:42:32.906176 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:32 crc kubenswrapper[4907]: I1009 19:42:32.971652 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-utilities\") pod \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " Oct 09 19:42:32 crc kubenswrapper[4907]: I1009 19:42:32.971712 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-catalog-content\") pod \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " Oct 09 19:42:32 crc kubenswrapper[4907]: I1009 19:42:32.971745 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh7hz\" (UniqueName: \"kubernetes.io/projected/26811e3a-bf9d-4ebf-bef8-650d566f6c90-kube-api-access-lh7hz\") pod \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\" (UID: \"26811e3a-bf9d-4ebf-bef8-650d566f6c90\") " Oct 09 19:42:32 crc kubenswrapper[4907]: I1009 19:42:32.972659 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-utilities" (OuterVolumeSpecName: "utilities") pod "26811e3a-bf9d-4ebf-bef8-650d566f6c90" (UID: "26811e3a-bf9d-4ebf-bef8-650d566f6c90"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:42:32 crc kubenswrapper[4907]: I1009 19:42:32.976994 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26811e3a-bf9d-4ebf-bef8-650d566f6c90-kube-api-access-lh7hz" (OuterVolumeSpecName: "kube-api-access-lh7hz") pod "26811e3a-bf9d-4ebf-bef8-650d566f6c90" (UID: "26811e3a-bf9d-4ebf-bef8-650d566f6c90"). InnerVolumeSpecName "kube-api-access-lh7hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:42:32 crc kubenswrapper[4907]: I1009 19:42:32.998059 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26811e3a-bf9d-4ebf-bef8-650d566f6c90" (UID: "26811e3a-bf9d-4ebf-bef8-650d566f6c90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.073396 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.073554 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26811e3a-bf9d-4ebf-bef8-650d566f6c90-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.073579 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh7hz\" (UniqueName: \"kubernetes.io/projected/26811e3a-bf9d-4ebf-bef8-650d566f6c90-kube-api-access-lh7hz\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.490442 4907 generic.go:334] "Generic (PLEG): container finished" podID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" 
containerID="d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4" exitCode=0 Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.490557 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d585w" event={"ID":"26811e3a-bf9d-4ebf-bef8-650d566f6c90","Type":"ContainerDied","Data":"d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4"} Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.490581 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d585w" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.490609 4907 scope.go:117] "RemoveContainer" containerID="d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.490596 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d585w" event={"ID":"26811e3a-bf9d-4ebf-bef8-650d566f6c90","Type":"ContainerDied","Data":"c1bc38c450af71ad75d4ba14051bf4ee76b96ddcdfe5c1267ea35af05dc17338"} Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.492727 4907 generic.go:334] "Generic (PLEG): container finished" podID="ba53143e-9c68-4c9f-89b8-45429be8e899" containerID="ef6b17d42adaa133d98ee40380baee53ea2243d34992838a20ca475a57576bc9" exitCode=0 Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.492829 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerDied","Data":"ef6b17d42adaa133d98ee40380baee53ea2243d34992838a20ca475a57576bc9"} Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.495623 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" event={"ID":"98b09a68-cabf-431c-9885-8f6e36c84de6","Type":"ContainerStarted","Data":"c0e25e7220b05ece4597b3c782a2a35013ab3861a62a2be18332fdd9968e2b07"} Oct 
09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.495793 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.516267 4907 scope.go:117] "RemoveContainer" containerID="9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.544577 4907 scope.go:117] "RemoveContainer" containerID="87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.557432 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" podStartSLOduration=2.26778018 podStartE2EDuration="9.557406568s" podCreationTimestamp="2025-10-09 19:42:24 +0000 UTC" firstStartedPulling="2025-10-09 19:42:25.285454933 +0000 UTC m=+830.817422422" lastFinishedPulling="2025-10-09 19:42:32.575081321 +0000 UTC m=+838.107048810" observedRunningTime="2025-10-09 19:42:33.55323105 +0000 UTC m=+839.085198559" watchObservedRunningTime="2025-10-09 19:42:33.557406568 +0000 UTC m=+839.089374067" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.570747 4907 scope.go:117] "RemoveContainer" containerID="d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4" Oct 09 19:42:33 crc kubenswrapper[4907]: E1009 19:42:33.571253 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4\": container with ID starting with d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4 not found: ID does not exist" containerID="d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.571318 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4"} err="failed to get container status \"d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4\": rpc error: code = NotFound desc = could not find container \"d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4\": container with ID starting with d081da57a22c9d8816120e309b03e622f8331681276f5e3dff0d71bc0fc9e7f4 not found: ID does not exist" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.571349 4907 scope.go:117] "RemoveContainer" containerID="9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961" Oct 09 19:42:33 crc kubenswrapper[4907]: E1009 19:42:33.571917 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961\": container with ID starting with 9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961 not found: ID does not exist" containerID="9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.571959 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961"} err="failed to get container status \"9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961\": rpc error: code = NotFound desc = could not find container \"9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961\": container with ID starting with 9c0b720212cf7852a49ae32822af800a2c365a4d9e66d1d6292613618d623961 not found: ID does not exist" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.571977 4907 scope.go:117] "RemoveContainer" containerID="87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796" Oct 09 19:42:33 crc kubenswrapper[4907]: E1009 19:42:33.572371 4907 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796\": container with ID starting with 87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796 not found: ID does not exist" containerID="87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.572408 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796"} err="failed to get container status \"87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796\": rpc error: code = NotFound desc = could not find container \"87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796\": container with ID starting with 87e2846695aee4ea57c9235a2b2b15f15227823098a8c5b5297481881395e796 not found: ID does not exist" Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.572715 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d585w"] Oct 09 19:42:33 crc kubenswrapper[4907]: I1009 19:42:33.577929 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d585w"] Oct 09 19:42:34 crc kubenswrapper[4907]: I1009 19:42:34.507637 4907 generic.go:334] "Generic (PLEG): container finished" podID="ba53143e-9c68-4c9f-89b8-45429be8e899" containerID="af2c4d6bbfdb8a7e522933b75371feac3e9e0155ee5c13b59c5377a37876e486" exitCode=0 Oct 09 19:42:34 crc kubenswrapper[4907]: I1009 19:42:34.507745 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerDied","Data":"af2c4d6bbfdb8a7e522933b75371feac3e9e0155ee5c13b59c5377a37876e486"} Oct 09 19:42:35 crc kubenswrapper[4907]: I1009 19:42:35.162400 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" path="/var/lib/kubelet/pods/26811e3a-bf9d-4ebf-bef8-650d566f6c90/volumes" Oct 09 19:42:35 crc kubenswrapper[4907]: I1009 19:42:35.520343 4907 generic.go:334] "Generic (PLEG): container finished" podID="ba53143e-9c68-4c9f-89b8-45429be8e899" containerID="a9ca33b37f2708ac9258e6438480c8b90a331d480f7b64f8e133496b2c335d4f" exitCode=0 Oct 09 19:42:35 crc kubenswrapper[4907]: I1009 19:42:35.520446 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerDied","Data":"a9ca33b37f2708ac9258e6438480c8b90a331d480f7b64f8e133496b2c335d4f"} Oct 09 19:42:36 crc kubenswrapper[4907]: I1009 19:42:36.438328 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-m6wfh" Oct 09 19:42:36 crc kubenswrapper[4907]: I1009 19:42:36.552455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerStarted","Data":"273beacd8213274657a05a5b0b5ca02fb31dedd7bb365ab251ad2e2f497e8062"} Oct 09 19:42:36 crc kubenswrapper[4907]: I1009 19:42:36.552553 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerStarted","Data":"9c6d97918673534ef965e7792e3134d97ab1a9f8859b12445ab8711dabacfb33"} Oct 09 19:42:36 crc kubenswrapper[4907]: I1009 19:42:36.552573 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerStarted","Data":"d2e2a1d4ae3813c2bbf925906fe39a9211a5d481cae858cf8d8db74265b9fc22"} Oct 09 19:42:36 crc kubenswrapper[4907]: I1009 19:42:36.552591 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" 
event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerStarted","Data":"15714583a155dea00e89d9c4050cd9084013e134fe1ce29121b644a6272a3d6f"} Oct 09 19:42:36 crc kubenswrapper[4907]: I1009 19:42:36.552608 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerStarted","Data":"09dd74cfffc9bd3d14644e55638510e9784b264c1a3736da1c0e1ad2da8fd615"} Oct 09 19:42:37 crc kubenswrapper[4907]: I1009 19:42:37.563789 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-km469" event={"ID":"ba53143e-9c68-4c9f-89b8-45429be8e899","Type":"ContainerStarted","Data":"882960c7e432277634f7727a03ca02f5fbed133cad0398d2e75429a5a2e590a4"} Oct 09 19:42:37 crc kubenswrapper[4907]: I1009 19:42:37.564937 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-km469" Oct 09 19:42:37 crc kubenswrapper[4907]: I1009 19:42:37.588900 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-km469" podStartSLOduration=6.568679512 podStartE2EDuration="13.588876158s" podCreationTimestamp="2025-10-09 19:42:24 +0000 UTC" firstStartedPulling="2025-10-09 19:42:25.577847256 +0000 UTC m=+831.109814755" lastFinishedPulling="2025-10-09 19:42:32.598043902 +0000 UTC m=+838.130011401" observedRunningTime="2025-10-09 19:42:37.587053571 +0000 UTC m=+843.119021100" watchObservedRunningTime="2025-10-09 19:42:37.588876158 +0000 UTC m=+843.120843657" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310458 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dzmbg"] Oct 09 19:42:39 crc kubenswrapper[4907]: E1009 19:42:39.310700 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="extract-utilities" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310712 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="extract-utilities" Oct 09 19:42:39 crc kubenswrapper[4907]: E1009 19:42:39.310726 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="extract-content" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310731 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="extract-content" Oct 09 19:42:39 crc kubenswrapper[4907]: E1009 19:42:39.310740 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="registry-server" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310746 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="registry-server" Oct 09 19:42:39 crc kubenswrapper[4907]: E1009 19:42:39.310766 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="extract-content" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310772 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="extract-content" Oct 09 19:42:39 crc kubenswrapper[4907]: E1009 19:42:39.310782 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="registry-server" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310787 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="registry-server" Oct 09 19:42:39 crc kubenswrapper[4907]: E1009 19:42:39.310808 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="extract-utilities" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310813 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="extract-utilities" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310920 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="26811e3a-bf9d-4ebf-bef8-650d566f6c90" containerName="registry-server" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.310934 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1944eafa-cdc2-474b-9ef1-7404557348be" containerName="registry-server" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.311312 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dzmbg" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.315190 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tcw98" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.317234 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.317839 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.340808 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dzmbg"] Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.367341 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7pvj\" (UniqueName: \"kubernetes.io/projected/ef3c7ba4-eca6-4984-9f88-1ce1632970ca-kube-api-access-g7pvj\") pod \"openstack-operator-index-dzmbg\" (UID: \"ef3c7ba4-eca6-4984-9f88-1ce1632970ca\") " pod="openstack-operators/openstack-operator-index-dzmbg" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.468974 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g7pvj\" (UniqueName: \"kubernetes.io/projected/ef3c7ba4-eca6-4984-9f88-1ce1632970ca-kube-api-access-g7pvj\") pod \"openstack-operator-index-dzmbg\" (UID: \"ef3c7ba4-eca6-4984-9f88-1ce1632970ca\") " pod="openstack-operators/openstack-operator-index-dzmbg" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.492150 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7pvj\" (UniqueName: \"kubernetes.io/projected/ef3c7ba4-eca6-4984-9f88-1ce1632970ca-kube-api-access-g7pvj\") pod \"openstack-operator-index-dzmbg\" (UID: \"ef3c7ba4-eca6-4984-9f88-1ce1632970ca\") " pod="openstack-operators/openstack-operator-index-dzmbg" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.629862 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dzmbg" Oct 09 19:42:39 crc kubenswrapper[4907]: I1009 19:42:39.831457 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dzmbg"] Oct 09 19:42:39 crc kubenswrapper[4907]: W1009 19:42:39.835662 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef3c7ba4_eca6_4984_9f88_1ce1632970ca.slice/crio-d3f10b18409ccc4fa447e53eb30f54ab9a5e153b26d3e6a43ec54b38620e748c WatchSource:0}: Error finding container d3f10b18409ccc4fa447e53eb30f54ab9a5e153b26d3e6a43ec54b38620e748c: Status 404 returned error can't find the container with id d3f10b18409ccc4fa447e53eb30f54ab9a5e153b26d3e6a43ec54b38620e748c Oct 09 19:42:40 crc kubenswrapper[4907]: I1009 19:42:40.476445 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-km469" Oct 09 19:42:40 crc kubenswrapper[4907]: I1009 19:42:40.514202 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-km469" Oct 09 
19:42:40 crc kubenswrapper[4907]: I1009 19:42:40.582511 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzmbg" event={"ID":"ef3c7ba4-eca6-4984-9f88-1ce1632970ca","Type":"ContainerStarted","Data":"d3f10b18409ccc4fa447e53eb30f54ab9a5e153b26d3e6a43ec54b38620e748c"} Oct 09 19:42:42 crc kubenswrapper[4907]: I1009 19:42:42.598592 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzmbg" event={"ID":"ef3c7ba4-eca6-4984-9f88-1ce1632970ca","Type":"ContainerStarted","Data":"23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff"} Oct 09 19:42:42 crc kubenswrapper[4907]: I1009 19:42:42.617052 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dzmbg" podStartSLOduration=1.430127185 podStartE2EDuration="3.617030915s" podCreationTimestamp="2025-10-09 19:42:39 +0000 UTC" firstStartedPulling="2025-10-09 19:42:39.838083943 +0000 UTC m=+845.370051432" lastFinishedPulling="2025-10-09 19:42:42.024987663 +0000 UTC m=+847.556955162" observedRunningTime="2025-10-09 19:42:42.614279134 +0000 UTC m=+848.146246623" watchObservedRunningTime="2025-10-09 19:42:42.617030915 +0000 UTC m=+848.148998414" Oct 09 19:42:42 crc kubenswrapper[4907]: I1009 19:42:42.893177 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dzmbg"] Oct 09 19:42:43 crc kubenswrapper[4907]: I1009 19:42:43.734409 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9pktm"] Oct 09 19:42:43 crc kubenswrapper[4907]: I1009 19:42:43.736071 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:43 crc kubenswrapper[4907]: I1009 19:42:43.751031 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9pktm"] Oct 09 19:42:43 crc kubenswrapper[4907]: I1009 19:42:43.837175 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwtb4\" (UniqueName: \"kubernetes.io/projected/abdc2315-d020-4dc6-901d-75db1c33254f-kube-api-access-bwtb4\") pod \"openstack-operator-index-9pktm\" (UID: \"abdc2315-d020-4dc6-901d-75db1c33254f\") " pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:43 crc kubenswrapper[4907]: I1009 19:42:43.939038 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwtb4\" (UniqueName: \"kubernetes.io/projected/abdc2315-d020-4dc6-901d-75db1c33254f-kube-api-access-bwtb4\") pod \"openstack-operator-index-9pktm\" (UID: \"abdc2315-d020-4dc6-901d-75db1c33254f\") " pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:43 crc kubenswrapper[4907]: I1009 19:42:43.971195 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwtb4\" (UniqueName: \"kubernetes.io/projected/abdc2315-d020-4dc6-901d-75db1c33254f-kube-api-access-bwtb4\") pod \"openstack-operator-index-9pktm\" (UID: \"abdc2315-d020-4dc6-901d-75db1c33254f\") " pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:44 crc kubenswrapper[4907]: I1009 19:42:44.058817 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:44 crc kubenswrapper[4907]: I1009 19:42:44.565644 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9pktm"] Oct 09 19:42:44 crc kubenswrapper[4907]: I1009 19:42:44.614401 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9pktm" event={"ID":"abdc2315-d020-4dc6-901d-75db1c33254f","Type":"ContainerStarted","Data":"be67a2ffe2ac630fa32e364258362c6fdcf00bf7e15639f34ce1f2024af9e206"} Oct 09 19:42:44 crc kubenswrapper[4907]: I1009 19:42:44.614554 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dzmbg" podUID="ef3c7ba4-eca6-4984-9f88-1ce1632970ca" containerName="registry-server" containerID="cri-o://23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff" gracePeriod=2 Oct 09 19:42:44 crc kubenswrapper[4907]: I1009 19:42:44.873770 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-2z2vr" Oct 09 19:42:44 crc kubenswrapper[4907]: I1009 19:42:44.973518 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-6sl6d" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.014893 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dzmbg" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.156119 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7pvj\" (UniqueName: \"kubernetes.io/projected/ef3c7ba4-eca6-4984-9f88-1ce1632970ca-kube-api-access-g7pvj\") pod \"ef3c7ba4-eca6-4984-9f88-1ce1632970ca\" (UID: \"ef3c7ba4-eca6-4984-9f88-1ce1632970ca\") " Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.164966 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3c7ba4-eca6-4984-9f88-1ce1632970ca-kube-api-access-g7pvj" (OuterVolumeSpecName: "kube-api-access-g7pvj") pod "ef3c7ba4-eca6-4984-9f88-1ce1632970ca" (UID: "ef3c7ba4-eca6-4984-9f88-1ce1632970ca"). InnerVolumeSpecName "kube-api-access-g7pvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.257696 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7pvj\" (UniqueName: \"kubernetes.io/projected/ef3c7ba4-eca6-4984-9f88-1ce1632970ca-kube-api-access-g7pvj\") on node \"crc\" DevicePath \"\"" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.482113 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-km469" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.626437 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9pktm" event={"ID":"abdc2315-d020-4dc6-901d-75db1c33254f","Type":"ContainerStarted","Data":"fdf6d558b90c961d4c529d51498a7526badd3aa7d05063a9143be430de151fc5"} Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.629414 4907 generic.go:334] "Generic (PLEG): container finished" podID="ef3c7ba4-eca6-4984-9f88-1ce1632970ca" containerID="23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff" exitCode=0 Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 
19:42:45.629485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzmbg" event={"ID":"ef3c7ba4-eca6-4984-9f88-1ce1632970ca","Type":"ContainerDied","Data":"23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff"} Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.629513 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzmbg" event={"ID":"ef3c7ba4-eca6-4984-9f88-1ce1632970ca","Type":"ContainerDied","Data":"d3f10b18409ccc4fa447e53eb30f54ab9a5e153b26d3e6a43ec54b38620e748c"} Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.629573 4907 scope.go:117] "RemoveContainer" containerID="23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.630397 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dzmbg" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.644604 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9pktm" podStartSLOduration=2.594138762 podStartE2EDuration="2.644584742s" podCreationTimestamp="2025-10-09 19:42:43 +0000 UTC" firstStartedPulling="2025-10-09 19:42:44.58133447 +0000 UTC m=+850.113301959" lastFinishedPulling="2025-10-09 19:42:44.63178045 +0000 UTC m=+850.163747939" observedRunningTime="2025-10-09 19:42:45.643971676 +0000 UTC m=+851.175939175" watchObservedRunningTime="2025-10-09 19:42:45.644584742 +0000 UTC m=+851.176552231" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.650665 4907 scope.go:117] "RemoveContainer" containerID="23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff" Oct 09 19:42:45 crc kubenswrapper[4907]: E1009 19:42:45.651139 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff\": container with ID starting with 23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff not found: ID does not exist" containerID="23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.655121 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff"} err="failed to get container status \"23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff\": rpc error: code = NotFound desc = could not find container \"23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff\": container with ID starting with 23689af8dc0294a307820b1c21bf05e759289bf8fccde1075f1b2bc82e7075ff not found: ID does not exist" Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.657779 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dzmbg"] Oct 09 19:42:45 crc kubenswrapper[4907]: I1009 19:42:45.670835 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dzmbg"] Oct 09 19:42:47 crc kubenswrapper[4907]: I1009 19:42:47.168344 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3c7ba4-eca6-4984-9f88-1ce1632970ca" path="/var/lib/kubelet/pods/ef3c7ba4-eca6-4984-9f88-1ce1632970ca/volumes" Oct 09 19:42:54 crc kubenswrapper[4907]: I1009 19:42:54.059724 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:54 crc kubenswrapper[4907]: I1009 19:42:54.060608 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:54 crc kubenswrapper[4907]: I1009 19:42:54.103234 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:42:54 crc kubenswrapper[4907]: I1009 19:42:54.733760 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9pktm" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.594064 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9"] Oct 09 19:43:01 crc kubenswrapper[4907]: E1009 19:43:01.595193 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3c7ba4-eca6-4984-9f88-1ce1632970ca" containerName="registry-server" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.595218 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3c7ba4-eca6-4984-9f88-1ce1632970ca" containerName="registry-server" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.595499 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3c7ba4-eca6-4984-9f88-1ce1632970ca" containerName="registry-server" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.597375 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.602533 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2bdw6" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.615894 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9"] Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.727959 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-util\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.728514 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mjn\" (UniqueName: \"kubernetes.io/projected/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-kube-api-access-65mjn\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.728685 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-bundle\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 
19:43:01.830740 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-util\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.830007 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-util\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.830906 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65mjn\" (UniqueName: \"kubernetes.io/projected/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-kube-api-access-65mjn\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.831522 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-bundle\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.832019 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-bundle\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.868382 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mjn\" (UniqueName: \"kubernetes.io/projected/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-kube-api-access-65mjn\") pod \"d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:01 crc kubenswrapper[4907]: I1009 19:43:01.925148 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:02 crc kubenswrapper[4907]: I1009 19:43:02.397425 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9"] Oct 09 19:43:02 crc kubenswrapper[4907]: I1009 19:43:02.762365 4907 generic.go:334] "Generic (PLEG): container finished" podID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerID="81809877433c7099e74abf16ac23c02f01de9e3501588eb4a037ef66aec304ed" exitCode=0 Oct 09 19:43:02 crc kubenswrapper[4907]: I1009 19:43:02.762818 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" event={"ID":"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3","Type":"ContainerDied","Data":"81809877433c7099e74abf16ac23c02f01de9e3501588eb4a037ef66aec304ed"} Oct 09 19:43:02 crc kubenswrapper[4907]: I1009 19:43:02.762902 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" event={"ID":"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3","Type":"ContainerStarted","Data":"37e536e6d327559327b5a263e4dec955cfad4ab406be009096b80ceeba9bb88c"} Oct 09 19:43:03 crc kubenswrapper[4907]: I1009 19:43:03.773579 4907 generic.go:334] "Generic (PLEG): container finished" podID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerID="1d9b219a2bc22c90749c5210f6f94cf95a01416e9776997508c192114a79b7ee" exitCode=0 Oct 09 19:43:03 crc kubenswrapper[4907]: I1009 19:43:03.773621 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" event={"ID":"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3","Type":"ContainerDied","Data":"1d9b219a2bc22c90749c5210f6f94cf95a01416e9776997508c192114a79b7ee"} Oct 09 19:43:04 crc kubenswrapper[4907]: I1009 19:43:04.782606 4907 generic.go:334] "Generic (PLEG): container finished" podID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerID="cccc0611a61d9e35db6632fd3059e1732eb0fcf9b4ee8a362a8e099e81a1834a" exitCode=0 Oct 09 19:43:04 crc kubenswrapper[4907]: I1009 19:43:04.783525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" event={"ID":"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3","Type":"ContainerDied","Data":"cccc0611a61d9e35db6632fd3059e1732eb0fcf9b4ee8a362a8e099e81a1834a"} Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.138321 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.299567 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.299645 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.303061 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-bundle\") pod \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.303131 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65mjn\" (UniqueName: \"kubernetes.io/projected/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-kube-api-access-65mjn\") pod \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.303170 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-util\") pod \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\" (UID: \"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3\") " Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.304002 4907 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-bundle" (OuterVolumeSpecName: "bundle") pod "df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" (UID: "df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.311266 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-kube-api-access-65mjn" (OuterVolumeSpecName: "kube-api-access-65mjn") pod "df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" (UID: "df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3"). InnerVolumeSpecName "kube-api-access-65mjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.321379 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-util" (OuterVolumeSpecName: "util") pod "df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" (UID: "df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.405426 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-util\") on node \"crc\" DevicePath \"\"" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.405511 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.405530 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65mjn\" (UniqueName: \"kubernetes.io/projected/df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3-kube-api-access-65mjn\") on node \"crc\" DevicePath \"\"" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.804046 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" event={"ID":"df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3","Type":"ContainerDied","Data":"37e536e6d327559327b5a263e4dec955cfad4ab406be009096b80ceeba9bb88c"} Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.804119 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e536e6d327559327b5a263e4dec955cfad4ab406be009096b80ceeba9bb88c" Oct 09 19:43:06 crc kubenswrapper[4907]: I1009 19:43:06.804152 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.262132 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh"] Oct 09 19:43:09 crc kubenswrapper[4907]: E1009 19:43:09.262771 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerName="extract" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.262788 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerName="extract" Oct 09 19:43:09 crc kubenswrapper[4907]: E1009 19:43:09.262802 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerName="util" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.262810 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerName="util" Oct 09 19:43:09 crc kubenswrapper[4907]: E1009 19:43:09.262833 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerName="pull" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.262843 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerName="pull" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.262981 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3" containerName="extract" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.263774 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.266559 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-l8dvh" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.285576 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh"] Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.346450 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc6t4\" (UniqueName: \"kubernetes.io/projected/715d1a2f-5bf3-4ef7-9086-c1f450daa6eb-kube-api-access-zc6t4\") pod \"openstack-operator-controller-operator-88fd6dc46-qtcmh\" (UID: \"715d1a2f-5bf3-4ef7-9086-c1f450daa6eb\") " pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.447591 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc6t4\" (UniqueName: \"kubernetes.io/projected/715d1a2f-5bf3-4ef7-9086-c1f450daa6eb-kube-api-access-zc6t4\") pod \"openstack-operator-controller-operator-88fd6dc46-qtcmh\" (UID: \"715d1a2f-5bf3-4ef7-9086-c1f450daa6eb\") " pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.469743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc6t4\" (UniqueName: \"kubernetes.io/projected/715d1a2f-5bf3-4ef7-9086-c1f450daa6eb-kube-api-access-zc6t4\") pod \"openstack-operator-controller-operator-88fd6dc46-qtcmh\" (UID: \"715d1a2f-5bf3-4ef7-9086-c1f450daa6eb\") " pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" Oct 09 19:43:09 crc kubenswrapper[4907]: I1009 19:43:09.580376 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" Oct 09 19:43:10 crc kubenswrapper[4907]: I1009 19:43:10.001157 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh"] Oct 09 19:43:10 crc kubenswrapper[4907]: I1009 19:43:10.833592 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" event={"ID":"715d1a2f-5bf3-4ef7-9086-c1f450daa6eb","Type":"ContainerStarted","Data":"7cde9952ada14c2c2e10751f88cdf805e1bca42e1f97bc80f39840d6042355b7"} Oct 09 19:43:13 crc kubenswrapper[4907]: I1009 19:43:13.852819 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" event={"ID":"715d1a2f-5bf3-4ef7-9086-c1f450daa6eb","Type":"ContainerStarted","Data":"f5d9f40446d5ec0a2b4cb919dba47b9826f4a497c2b43bc78a6c22ee1bee6be6"} Oct 09 19:43:16 crc kubenswrapper[4907]: I1009 19:43:16.871898 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" event={"ID":"715d1a2f-5bf3-4ef7-9086-c1f450daa6eb","Type":"ContainerStarted","Data":"6b92e39d78c5e69306aa9570fab879896cb3451763a11e0644355d4c38bc369a"} Oct 09 19:43:16 crc kubenswrapper[4907]: I1009 19:43:16.872296 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" Oct 09 19:43:16 crc kubenswrapper[4907]: I1009 19:43:16.931031 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" podStartSLOduration=2.135777797 podStartE2EDuration="7.931015006s" podCreationTimestamp="2025-10-09 19:43:09 +0000 UTC" firstStartedPulling="2025-10-09 
19:43:10.004271827 +0000 UTC m=+875.536239326" lastFinishedPulling="2025-10-09 19:43:15.799509036 +0000 UTC m=+881.331476535" observedRunningTime="2025-10-09 19:43:16.921944772 +0000 UTC m=+882.453912281" watchObservedRunningTime="2025-10-09 19:43:16.931015006 +0000 UTC m=+882.462982485" Oct 09 19:43:19 crc kubenswrapper[4907]: I1009 19:43:19.584712 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-88fd6dc46-qtcmh" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.299160 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.299824 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.382857 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.384144 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.387998 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c6cqd" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.393928 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.409200 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.410427 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.412934 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-w57tf" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.431886 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h54m\" (UniqueName: \"kubernetes.io/projected/396b2bde-8328-4285-81a3-58d361096cf8-kube-api-access-8h54m\") pod \"barbican-operator-controller-manager-64f84fcdbb-jzjwr\" (UID: \"396b2bde-8328-4285-81a3-58d361096cf8\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.431978 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/b1701060-cf14-4dfc-9545-5b63be29728a-kube-api-access-v2q7n\") pod \"cinder-operator-controller-manager-59cdc64769-5pcmk\" (UID: 
\"b1701060-cf14-4dfc-9545-5b63be29728a\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.464641 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.488615 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.490402 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.493222 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hvq4r" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.520150 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.528788 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.529181 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.535087 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-x7q8m" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.535500 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsv75\" (UniqueName: \"kubernetes.io/projected/1353b956-2119-4690-be09-9f9b788737a5-kube-api-access-lsv75\") pod \"designate-operator-controller-manager-687df44cdb-5d6rr\" (UID: \"1353b956-2119-4690-be09-9f9b788737a5\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.535549 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/b1701060-cf14-4dfc-9545-5b63be29728a-kube-api-access-v2q7n\") pod \"cinder-operator-controller-manager-59cdc64769-5pcmk\" (UID: \"b1701060-cf14-4dfc-9545-5b63be29728a\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.535599 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h54m\" (UniqueName: \"kubernetes.io/projected/396b2bde-8328-4285-81a3-58d361096cf8-kube-api-access-8h54m\") pod \"barbican-operator-controller-manager-64f84fcdbb-jzjwr\" (UID: \"396b2bde-8328-4285-81a3-58d361096cf8\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.554559 4907 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.555694 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.559430 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mqmjn" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.571342 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h54m\" (UniqueName: \"kubernetes.io/projected/396b2bde-8328-4285-81a3-58d361096cf8-kube-api-access-8h54m\") pod \"barbican-operator-controller-manager-64f84fcdbb-jzjwr\" (UID: \"396b2bde-8328-4285-81a3-58d361096cf8\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.571402 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.577023 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/b1701060-cf14-4dfc-9545-5b63be29728a-kube-api-access-v2q7n\") pod \"cinder-operator-controller-manager-59cdc64769-5pcmk\" (UID: \"b1701060-cf14-4dfc-9545-5b63be29728a\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.577104 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.583435 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.585641 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.590846 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-cw2xp" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.595196 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.596645 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.602927 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rxdzr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.603177 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.610721 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.628701 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.637066 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfcln\" (UniqueName: \"kubernetes.io/projected/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-kube-api-access-xfcln\") pod 
\"infra-operator-controller-manager-585fc5b659-zwj6t\" (UID: \"5870b9a9-c7a2-4e57-b917-e5a41c20dc55\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.637133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4tb\" (UniqueName: \"kubernetes.io/projected/aa1daa5a-4e9e-4378-81ad-0dab2895f34a-kube-api-access-gh4tb\") pod \"heat-operator-controller-manager-6d9967f8dd-d2zsg\" (UID: \"aa1daa5a-4e9e-4378-81ad-0dab2895f34a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.637154 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqs5\" (UniqueName: \"kubernetes.io/projected/8c17b476-94f3-4391-a755-e816a5ed56e0-kube-api-access-nhqs5\") pod \"horizon-operator-controller-manager-6d74794d9b-595rv\" (UID: \"8c17b476-94f3-4391-a755-e816a5ed56e0\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.637169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-cert\") pod \"infra-operator-controller-manager-585fc5b659-zwj6t\" (UID: \"5870b9a9-c7a2-4e57-b917-e5a41c20dc55\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.637214 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsv75\" (UniqueName: \"kubernetes.io/projected/1353b956-2119-4690-be09-9f9b788737a5-kube-api-access-lsv75\") pod \"designate-operator-controller-manager-687df44cdb-5d6rr\" (UID: \"1353b956-2119-4690-be09-9f9b788737a5\") " 
pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.637261 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwxl\" (UniqueName: \"kubernetes.io/projected/cdc3d576-f3e6-4016-8856-ff8e5e6cf299-kube-api-access-5xwxl\") pod \"glance-operator-controller-manager-7bb46cd7d-tpg5p\" (UID: \"cdc3d576-f3e6-4016-8856-ff8e5e6cf299\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.651722 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.653183 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.658597 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vp85b" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.669111 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.670687 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.679536 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.679735 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rc829" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.688280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsv75\" (UniqueName: \"kubernetes.io/projected/1353b956-2119-4690-be09-9f9b788737a5-kube-api-access-lsv75\") pod \"designate-operator-controller-manager-687df44cdb-5d6rr\" (UID: \"1353b956-2119-4690-be09-9f9b788737a5\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.704742 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.736676 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.737888 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.738940 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwxl\" (UniqueName: \"kubernetes.io/projected/cdc3d576-f3e6-4016-8856-ff8e5e6cf299-kube-api-access-5xwxl\") pod \"glance-operator-controller-manager-7bb46cd7d-tpg5p\" (UID: \"cdc3d576-f3e6-4016-8856-ff8e5e6cf299\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.738982 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfcln\" (UniqueName: \"kubernetes.io/projected/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-kube-api-access-xfcln\") pod \"infra-operator-controller-manager-585fc5b659-zwj6t\" (UID: \"5870b9a9-c7a2-4e57-b917-e5a41c20dc55\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.739013 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfm8\" (UniqueName: \"kubernetes.io/projected/71138822-c6a1-4657-a640-9350e6e6965c-kube-api-access-xjfm8\") pod \"ironic-operator-controller-manager-74cb5cbc49-khm2s\" (UID: \"71138822-c6a1-4657-a640-9350e6e6965c\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.739036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmjm\" (UniqueName: \"kubernetes.io/projected/364dc10d-b5b4-4c0e-a480-7dc371fc6a0d-kube-api-access-glmjm\") pod \"keystone-operator-controller-manager-ddb98f99b-hjhwf\" (UID: \"364dc10d-b5b4-4c0e-a480-7dc371fc6a0d\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" Oct 09 19:43:36 crc 
kubenswrapper[4907]: I1009 19:43:36.739055 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4tb\" (UniqueName: \"kubernetes.io/projected/aa1daa5a-4e9e-4378-81ad-0dab2895f34a-kube-api-access-gh4tb\") pod \"heat-operator-controller-manager-6d9967f8dd-d2zsg\" (UID: \"aa1daa5a-4e9e-4378-81ad-0dab2895f34a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.739073 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqs5\" (UniqueName: \"kubernetes.io/projected/8c17b476-94f3-4391-a755-e816a5ed56e0-kube-api-access-nhqs5\") pod \"horizon-operator-controller-manager-6d74794d9b-595rv\" (UID: \"8c17b476-94f3-4391-a755-e816a5ed56e0\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.739086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-cert\") pod \"infra-operator-controller-manager-585fc5b659-zwj6t\" (UID: \"5870b9a9-c7a2-4e57-b917-e5a41c20dc55\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:36 crc kubenswrapper[4907]: E1009 19:43:36.739226 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 09 19:43:36 crc kubenswrapper[4907]: E1009 19:43:36.739268 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-cert podName:5870b9a9-c7a2-4e57-b917-e5a41c20dc55 nodeName:}" failed. No retries permitted until 2025-10-09 19:43:37.239253912 +0000 UTC m=+902.771221401 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-cert") pod "infra-operator-controller-manager-585fc5b659-zwj6t" (UID: "5870b9a9-c7a2-4e57-b917-e5a41c20dc55") : secret "infra-operator-webhook-server-cert" not found Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.743115 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5vg6f" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.754116 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.757084 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfcln\" (UniqueName: \"kubernetes.io/projected/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-kube-api-access-xfcln\") pod \"infra-operator-controller-manager-585fc5b659-zwj6t\" (UID: \"5870b9a9-c7a2-4e57-b917-e5a41c20dc55\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.762463 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.766154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwxl\" (UniqueName: \"kubernetes.io/projected/cdc3d576-f3e6-4016-8856-ff8e5e6cf299-kube-api-access-5xwxl\") pod \"glance-operator-controller-manager-7bb46cd7d-tpg5p\" (UID: \"cdc3d576-f3e6-4016-8856-ff8e5e6cf299\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.766315 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqs5\" (UniqueName: 
\"kubernetes.io/projected/8c17b476-94f3-4391-a755-e816a5ed56e0-kube-api-access-nhqs5\") pod \"horizon-operator-controller-manager-6d74794d9b-595rv\" (UID: \"8c17b476-94f3-4391-a755-e816a5ed56e0\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.770770 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4tb\" (UniqueName: \"kubernetes.io/projected/aa1daa5a-4e9e-4378-81ad-0dab2895f34a-kube-api-access-gh4tb\") pod \"heat-operator-controller-manager-6d9967f8dd-d2zsg\" (UID: \"aa1daa5a-4e9e-4378-81ad-0dab2895f34a\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.785619 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.786783 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.788348 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jp8zf" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.802737 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.815973 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.825473 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.825990 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.827184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.830175 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-wf4wk" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.839831 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hds92\" (UniqueName: \"kubernetes.io/projected/e5a81d4d-968e-43b0-b53a-e5c475773a29-kube-api-access-hds92\") pod \"mariadb-operator-controller-manager-5777b4f897-q2flj\" (UID: \"e5a81d4d-968e-43b0-b53a-e5c475773a29\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.839876 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk269\" (UniqueName: \"kubernetes.io/projected/9497c9e0-df89-48ae-be07-7df3e532bb35-kube-api-access-nk269\") pod \"manila-operator-controller-manager-59578bc799-jnbdz\" (UID: \"9497c9e0-df89-48ae-be07-7df3e532bb35\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.839916 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjfm8\" (UniqueName: \"kubernetes.io/projected/71138822-c6a1-4657-a640-9350e6e6965c-kube-api-access-xjfm8\") pod 
\"ironic-operator-controller-manager-74cb5cbc49-khm2s\" (UID: \"71138822-c6a1-4657-a640-9350e6e6965c\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.839937 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmjm\" (UniqueName: \"kubernetes.io/projected/364dc10d-b5b4-4c0e-a480-7dc371fc6a0d-kube-api-access-glmjm\") pod \"keystone-operator-controller-manager-ddb98f99b-hjhwf\" (UID: \"364dc10d-b5b4-4c0e-a480-7dc371fc6a0d\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.852437 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.853996 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.856866 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjfm8\" (UniqueName: \"kubernetes.io/projected/71138822-c6a1-4657-a640-9350e6e6965c-kube-api-access-xjfm8\") pod \"ironic-operator-controller-manager-74cb5cbc49-khm2s\" (UID: \"71138822-c6a1-4657-a640-9350e6e6965c\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.868144 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmjm\" (UniqueName: \"kubernetes.io/projected/364dc10d-b5b4-4c0e-a480-7dc371fc6a0d-kube-api-access-glmjm\") pod \"keystone-operator-controller-manager-ddb98f99b-hjhwf\" (UID: \"364dc10d-b5b4-4c0e-a480-7dc371fc6a0d\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" Oct 09 19:43:36 crc 
kubenswrapper[4907]: I1009 19:43:36.874880 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.875934 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.877582 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-pf7gs" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.879323 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.880342 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.882464 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-z9gv6" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.886047 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.900616 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.913445 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.917957 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.919586 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.920767 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-w26gp" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.941249 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/523bbf58-dcf0-49f5-a198-24878c574c70-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6\" (UID: \"523bbf58-dcf0-49f5-a198-24878c574c70\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.941294 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxct\" (UniqueName: \"kubernetes.io/projected/10b627f1-74be-41c8-a7e7-367beb0a828d-kube-api-access-vgxct\") pod \"octavia-operator-controller-manager-6d7c7ddf95-z5klh\" (UID: \"10b627f1-74be-41c8-a7e7-367beb0a828d\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.941369 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk269\" (UniqueName: \"kubernetes.io/projected/9497c9e0-df89-48ae-be07-7df3e532bb35-kube-api-access-nk269\") pod \"manila-operator-controller-manager-59578bc799-jnbdz\" (UID: \"9497c9e0-df89-48ae-be07-7df3e532bb35\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 
19:43:36.941406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzmf\" (UniqueName: \"kubernetes.io/projected/ddffdb06-43eb-44db-9afa-a56e2c6b467c-kube-api-access-5xzmf\") pod \"nova-operator-controller-manager-57bb74c7bf-4bsn7\" (UID: \"ddffdb06-43eb-44db-9afa-a56e2c6b467c\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.941449 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsr6w\" (UniqueName: \"kubernetes.io/projected/523bbf58-dcf0-49f5-a198-24878c574c70-kube-api-access-dsr6w\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6\" (UID: \"523bbf58-dcf0-49f5-a198-24878c574c70\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.941496 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx59l\" (UniqueName: \"kubernetes.io/projected/759e961c-957a-436b-80cd-14294fce30ad-kube-api-access-bx59l\") pod \"neutron-operator-controller-manager-797d478b46-t46vt\" (UID: \"759e961c-957a-436b-80cd-14294fce30ad\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.941543 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hds92\" (UniqueName: \"kubernetes.io/projected/e5a81d4d-968e-43b0-b53a-e5c475773a29-kube-api-access-hds92\") pod \"mariadb-operator-controller-manager-5777b4f897-q2flj\" (UID: \"e5a81d4d-968e-43b0-b53a-e5c475773a29\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.946189 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.957899 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.962104 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk269\" (UniqueName: \"kubernetes.io/projected/9497c9e0-df89-48ae-be07-7df3e532bb35-kube-api-access-nk269\") pod \"manila-operator-controller-manager-59578bc799-jnbdz\" (UID: \"9497c9e0-df89-48ae-be07-7df3e532bb35\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.967903 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hds92\" (UniqueName: \"kubernetes.io/projected/e5a81d4d-968e-43b0-b53a-e5c475773a29-kube-api-access-hds92\") pod \"mariadb-operator-controller-manager-5777b4f897-q2flj\" (UID: \"e5a81d4d-968e-43b0-b53a-e5c475773a29\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.977982 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.984323 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6"] Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.984474 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" Oct 09 19:43:36 crc kubenswrapper[4907]: I1009 19:43:36.999032 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-9srqs" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.008073 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.014122 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.027373 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-sk2rm" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.035759 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.040174 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.042902 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.043678 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxct\" (UniqueName: \"kubernetes.io/projected/10b627f1-74be-41c8-a7e7-367beb0a828d-kube-api-access-vgxct\") pod \"octavia-operator-controller-manager-6d7c7ddf95-z5klh\" (UID: \"10b627f1-74be-41c8-a7e7-367beb0a828d\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.043821 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzmf\" (UniqueName: \"kubernetes.io/projected/ddffdb06-43eb-44db-9afa-a56e2c6b467c-kube-api-access-5xzmf\") pod \"nova-operator-controller-manager-57bb74c7bf-4bsn7\" (UID: \"ddffdb06-43eb-44db-9afa-a56e2c6b467c\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.043920 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsr6w\" (UniqueName: \"kubernetes.io/projected/523bbf58-dcf0-49f5-a198-24878c574c70-kube-api-access-dsr6w\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6\" (UID: \"523bbf58-dcf0-49f5-a198-24878c574c70\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.044006 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrr22\" (UniqueName: \"kubernetes.io/projected/94e5bb04-8b14-4518-846b-721c24bc2348-kube-api-access-zrr22\") pod \"placement-operator-controller-manager-664664cb68-gbxbh\" (UID: \"94e5bb04-8b14-4518-846b-721c24bc2348\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" Oct 09 
19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.044089 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx59l\" (UniqueName: \"kubernetes.io/projected/759e961c-957a-436b-80cd-14294fce30ad-kube-api-access-bx59l\") pod \"neutron-operator-controller-manager-797d478b46-t46vt\" (UID: \"759e961c-957a-436b-80cd-14294fce30ad\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.044207 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ktg\" (UniqueName: \"kubernetes.io/projected/64d8141d-db49-44dd-90bc-20b75a642c99-kube-api-access-t5ktg\") pod \"ovn-operator-controller-manager-869cc7797f-v87t7\" (UID: \"64d8141d-db49-44dd-90bc-20b75a642c99\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.044295 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/523bbf58-dcf0-49f5-a198-24878c574c70-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6\" (UID: \"523bbf58-dcf0-49f5-a198-24878c574c70\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:37 crc kubenswrapper[4907]: E1009 19:43:37.044454 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 19:43:37 crc kubenswrapper[4907]: E1009 19:43:37.044587 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/523bbf58-dcf0-49f5-a198-24878c574c70-cert podName:523bbf58-dcf0-49f5-a198-24878c574c70 nodeName:}" failed. No retries permitted until 2025-10-09 19:43:37.544569928 +0000 UTC m=+903.076537407 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/523bbf58-dcf0-49f5-a198-24878c574c70-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" (UID: "523bbf58-dcf0-49f5-a198-24878c574c70") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.045075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.049644 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-l4j9z" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.050951 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.062277 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.071058 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.071172 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.075411 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx59l\" (UniqueName: \"kubernetes.io/projected/759e961c-957a-436b-80cd-14294fce30ad-kube-api-access-bx59l\") pod \"neutron-operator-controller-manager-797d478b46-t46vt\" (UID: \"759e961c-957a-436b-80cd-14294fce30ad\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.078534 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rv7cl" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.084807 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.087252 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzmf\" (UniqueName: \"kubernetes.io/projected/ddffdb06-43eb-44db-9afa-a56e2c6b467c-kube-api-access-5xzmf\") pod \"nova-operator-controller-manager-57bb74c7bf-4bsn7\" (UID: \"ddffdb06-43eb-44db-9afa-a56e2c6b467c\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.097984 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.099340 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.099955 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.100475 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsr6w\" (UniqueName: \"kubernetes.io/projected/523bbf58-dcf0-49f5-a198-24878c574c70-kube-api-access-dsr6w\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6\" (UID: \"523bbf58-dcf0-49f5-a198-24878c574c70\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.105075 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxct\" (UniqueName: \"kubernetes.io/projected/10b627f1-74be-41c8-a7e7-367beb0a828d-kube-api-access-vgxct\") pod \"octavia-operator-controller-manager-6d7c7ddf95-z5klh\" (UID: \"10b627f1-74be-41c8-a7e7-367beb0a828d\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.109429 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.113033 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mvztx" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.120527 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.130779 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.145770 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn8hb\" (UniqueName: \"kubernetes.io/projected/7150c799-4a61-4c14-9471-99fbc61a8f7b-kube-api-access-fn8hb\") pod \"telemetry-operator-controller-manager-895c94468-xtfng\" (UID: \"7150c799-4a61-4c14-9471-99fbc61a8f7b\") " pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.145830 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwf6\" (UniqueName: \"kubernetes.io/projected/1313e0f0-b372-43cd-8f32-7c6bd566ab1a-kube-api-access-lbwf6\") pod \"test-operator-controller-manager-ffcdd6c94-9g4cj\" (UID: \"1313e0f0-b372-43cd-8f32-7c6bd566ab1a\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.145847 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4k6l\" (UniqueName: \"kubernetes.io/projected/e51f1489-6999-474e-9ae4-5f8598e608d0-kube-api-access-n4k6l\") pod \"swift-operator-controller-manager-5f4d5dfdc6-8ztd5\" (UID: \"e51f1489-6999-474e-9ae4-5f8598e608d0\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.145873 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrr22\" (UniqueName: \"kubernetes.io/projected/94e5bb04-8b14-4518-846b-721c24bc2348-kube-api-access-zrr22\") pod \"placement-operator-controller-manager-664664cb68-gbxbh\" (UID: \"94e5bb04-8b14-4518-846b-721c24bc2348\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" 
Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.145915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5ktg\" (UniqueName: \"kubernetes.io/projected/64d8141d-db49-44dd-90bc-20b75a642c99-kube-api-access-t5ktg\") pod \"ovn-operator-controller-manager-869cc7797f-v87t7\" (UID: \"64d8141d-db49-44dd-90bc-20b75a642c99\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.151640 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.169027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ktg\" (UniqueName: \"kubernetes.io/projected/64d8141d-db49-44dd-90bc-20b75a642c99-kube-api-access-t5ktg\") pod \"ovn-operator-controller-manager-869cc7797f-v87t7\" (UID: \"64d8141d-db49-44dd-90bc-20b75a642c99\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.194508 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrr22\" (UniqueName: \"kubernetes.io/projected/94e5bb04-8b14-4518-846b-721c24bc2348-kube-api-access-zrr22\") pod \"placement-operator-controller-manager-664664cb68-gbxbh\" (UID: \"94e5bb04-8b14-4518-846b-721c24bc2348\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.221317 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.231689 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.239904 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-p5c55"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.244729 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.247135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-cert\") pod \"infra-operator-controller-manager-585fc5b659-zwj6t\" (UID: \"5870b9a9-c7a2-4e57-b917-e5a41c20dc55\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.247174 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwf6\" (UniqueName: \"kubernetes.io/projected/1313e0f0-b372-43cd-8f32-7c6bd566ab1a-kube-api-access-lbwf6\") pod \"test-operator-controller-manager-ffcdd6c94-9g4cj\" (UID: \"1313e0f0-b372-43cd-8f32-7c6bd566ab1a\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.247192 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4k6l\" (UniqueName: \"kubernetes.io/projected/e51f1489-6999-474e-9ae4-5f8598e608d0-kube-api-access-n4k6l\") pod \"swift-operator-controller-manager-5f4d5dfdc6-8ztd5\" (UID: \"e51f1489-6999-474e-9ae4-5f8598e608d0\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.247322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fn8hb\" (UniqueName: \"kubernetes.io/projected/7150c799-4a61-4c14-9471-99fbc61a8f7b-kube-api-access-fn8hb\") pod \"telemetry-operator-controller-manager-895c94468-xtfng\" (UID: \"7150c799-4a61-4c14-9471-99fbc61a8f7b\") " pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.253186 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-9xc66" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.267293 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5870b9a9-c7a2-4e57-b917-e5a41c20dc55-cert\") pod \"infra-operator-controller-manager-585fc5b659-zwj6t\" (UID: \"5870b9a9-c7a2-4e57-b917-e5a41c20dc55\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.273847 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4k6l\" (UniqueName: \"kubernetes.io/projected/e51f1489-6999-474e-9ae4-5f8598e608d0-kube-api-access-n4k6l\") pod \"swift-operator-controller-manager-5f4d5dfdc6-8ztd5\" (UID: \"e51f1489-6999-474e-9ae4-5f8598e608d0\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.274895 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-p5c55"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.275576 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn8hb\" (UniqueName: \"kubernetes.io/projected/7150c799-4a61-4c14-9471-99fbc61a8f7b-kube-api-access-fn8hb\") pod \"telemetry-operator-controller-manager-895c94468-xtfng\" (UID: \"7150c799-4a61-4c14-9471-99fbc61a8f7b\") " 
pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.283073 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwf6\" (UniqueName: \"kubernetes.io/projected/1313e0f0-b372-43cd-8f32-7c6bd566ab1a-kube-api-access-lbwf6\") pod \"test-operator-controller-manager-ffcdd6c94-9g4cj\" (UID: \"1313e0f0-b372-43cd-8f32-7c6bd566ab1a\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.305704 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.307722 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.310915 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.311254 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ns9pw" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.319805 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.329175 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.340869 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.343341 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.343374 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.346212 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-f5xgr" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.358995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-cert\") pod \"openstack-operator-controller-manager-6d899f9cc7-5mhzg\" (UID: \"385f8c3a-5a7a-4214-a8cf-9c6886264ea9\") " pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.359048 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzz9l\" (UniqueName: \"kubernetes.io/projected/5681fed3-74e3-4e40-beff-cebbe06023e4-kube-api-access-vzz9l\") pod \"watcher-operator-controller-manager-646675d848-p5c55\" (UID: \"5681fed3-74e3-4e40-beff-cebbe06023e4\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.359112 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6vsl\" (UniqueName: \"kubernetes.io/projected/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-kube-api-access-h6vsl\") pod \"openstack-operator-controller-manager-6d899f9cc7-5mhzg\" (UID: \"385f8c3a-5a7a-4214-a8cf-9c6886264ea9\") " pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.376776 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.388775 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.393971 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.419113 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.447057 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.453037 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.463284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-cert\") pod \"openstack-operator-controller-manager-6d899f9cc7-5mhzg\" (UID: \"385f8c3a-5a7a-4214-a8cf-9c6886264ea9\") " pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.463413 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzz9l\" (UniqueName: \"kubernetes.io/projected/5681fed3-74e3-4e40-beff-cebbe06023e4-kube-api-access-vzz9l\") pod \"watcher-operator-controller-manager-646675d848-p5c55\" (UID: \"5681fed3-74e3-4e40-beff-cebbe06023e4\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.463766 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzlz\" (UniqueName: \"kubernetes.io/projected/9ca2b641-57af-45b8-b0aa-3b45b08d13a7-kube-api-access-5vzlz\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5\" (UID: \"9ca2b641-57af-45b8-b0aa-3b45b08d13a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.463814 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6vsl\" (UniqueName: \"kubernetes.io/projected/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-kube-api-access-h6vsl\") pod 
\"openstack-operator-controller-manager-6d899f9cc7-5mhzg\" (UID: \"385f8c3a-5a7a-4214-a8cf-9c6886264ea9\") " pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" Oct 09 19:43:37 crc kubenswrapper[4907]: E1009 19:43:37.464686 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 09 19:43:37 crc kubenswrapper[4907]: E1009 19:43:37.464738 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-cert podName:385f8c3a-5a7a-4214-a8cf-9c6886264ea9 nodeName:}" failed. No retries permitted until 2025-10-09 19:43:37.964724502 +0000 UTC m=+903.496691991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-cert") pod "openstack-operator-controller-manager-6d899f9cc7-5mhzg" (UID: "385f8c3a-5a7a-4214-a8cf-9c6886264ea9") : secret "webhook-server-cert" not found Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.482510 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.499437 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzz9l\" (UniqueName: \"kubernetes.io/projected/5681fed3-74e3-4e40-beff-cebbe06023e4-kube-api-access-vzz9l\") pod \"watcher-operator-controller-manager-646675d848-p5c55\" (UID: \"5681fed3-74e3-4e40-beff-cebbe06023e4\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.501104 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6vsl\" (UniqueName: \"kubernetes.io/projected/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-kube-api-access-h6vsl\") pod \"openstack-operator-controller-manager-6d899f9cc7-5mhzg\" (UID: \"385f8c3a-5a7a-4214-a8cf-9c6886264ea9\") " pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.566254 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzlz\" (UniqueName: \"kubernetes.io/projected/9ca2b641-57af-45b8-b0aa-3b45b08d13a7-kube-api-access-5vzlz\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5\" (UID: \"9ca2b641-57af-45b8-b0aa-3b45b08d13a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.566306 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/523bbf58-dcf0-49f5-a198-24878c574c70-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6\" (UID: \"523bbf58-dcf0-49f5-a198-24878c574c70\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.570627 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/523bbf58-dcf0-49f5-a198-24878c574c70-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6\" (UID: \"523bbf58-dcf0-49f5-a198-24878c574c70\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.573142 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr"] Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.584578 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.601756 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzlz\" (UniqueName: \"kubernetes.io/projected/9ca2b641-57af-45b8-b0aa-3b45b08d13a7-kube-api-access-5vzlz\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5\" (UID: \"9ca2b641-57af-45b8-b0aa-3b45b08d13a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5" Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.603375 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p"] Oct 09 19:43:37 crc kubenswrapper[4907]: W1009 19:43:37.650249 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc3d576_f3e6_4016_8856_ff8e5e6cf299.slice/crio-774d74db3443c90b289643ec9a2554b1a21291007b22c9d9e9de1f92b05b0174 WatchSource:0}: Error finding container 774d74db3443c90b289643ec9a2554b1a21291007b22c9d9e9de1f92b05b0174: Status 404 returned error can't find the container with id 774d74db3443c90b289643ec9a2554b1a21291007b22c9d9e9de1f92b05b0174 Oct 09 19:43:37 crc 
kubenswrapper[4907]: I1009 19:43:37.672522 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5"
Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.707958 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg"]
Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.844430 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6"
Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.851050 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv"]
Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.974377 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-cert\") pod \"openstack-operator-controller-manager-6d899f9cc7-5mhzg\" (UID: \"385f8c3a-5a7a-4214-a8cf-9c6886264ea9\") " pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"
Oct 09 19:43:37 crc kubenswrapper[4907]: I1009 19:43:37.979934 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/385f8c3a-5a7a-4214-a8cf-9c6886264ea9-cert\") pod \"openstack-operator-controller-manager-6d899f9cc7-5mhzg\" (UID: \"385f8c3a-5a7a-4214-a8cf-9c6886264ea9\") " pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.037189 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" event={"ID":"aa1daa5a-4e9e-4378-81ad-0dab2895f34a","Type":"ContainerStarted","Data":"fce4fb7e0c7f2af67382a66b7b7d0f6789604c57ead30480aea8c1f7937087e4"}
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.043144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" event={"ID":"b1701060-cf14-4dfc-9545-5b63be29728a","Type":"ContainerStarted","Data":"78cd60336efffa7d81372113595ff507673adf6ecadfa2240cf0d42520c2c5b8"}
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.045696 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" event={"ID":"8c17b476-94f3-4391-a755-e816a5ed56e0","Type":"ContainerStarted","Data":"c0a56be536c7080f32249dd325fceb279c2c99e84f7b9ce5c7e449678ffc4fe0"}
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.046631 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" event={"ID":"396b2bde-8328-4285-81a3-58d361096cf8","Type":"ContainerStarted","Data":"126ac5b3693827017cc171cda0ae79e6b1d217bb9632990acd4c52d5effbe2b6"}
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.050740 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" event={"ID":"cdc3d576-f3e6-4016-8856-ff8e5e6cf299","Type":"ContainerStarted","Data":"774d74db3443c90b289643ec9a2554b1a21291007b22c9d9e9de1f92b05b0174"}
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.055847 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" event={"ID":"1353b956-2119-4690-be09-9f9b788737a5","Type":"ContainerStarted","Data":"67cdbe09e98957e472afd7f9a089d7c493279cf9a0d0df488b3a85d11b4f140f"}
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.249892 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.301874 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s"]
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.311911 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj"]
Oct 09 19:43:38 crc kubenswrapper[4907]: W1009 19:43:38.318263 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a81d4d_968e_43b0_b53a_e5c475773a29.slice/crio-d9acb934c3b297a37f9895fde1606bcace0dc246b9d410189cf64d62c96fa855 WatchSource:0}: Error finding container d9acb934c3b297a37f9895fde1606bcace0dc246b9d410189cf64d62c96fa855: Status 404 returned error can't find the container with id d9acb934c3b297a37f9895fde1606bcace0dc246b9d410189cf64d62c96fa855
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.536437 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh"]
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.556276 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7"]
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.592383 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz"]
Oct 09 19:43:38 crc kubenswrapper[4907]: W1009 19:43:38.602605 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b627f1_74be_41c8_a7e7_367beb0a828d.slice/crio-adeb8b36f321be336d7376e7bfb803454f605082aabc733a3d5d295a9055d548 WatchSource:0}: Error finding container adeb8b36f321be336d7376e7bfb803454f605082aabc733a3d5d295a9055d548: Status 404 returned error can't find the container with id adeb8b36f321be336d7376e7bfb803454f605082aabc733a3d5d295a9055d548
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.630894 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.757643 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf"]
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.773085 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt"]
Oct 09 19:43:38 crc kubenswrapper[4907]: W1009 19:43:38.787328 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod759e961c_957a_436b_80cd_14294fce30ad.slice/crio-59f2b4a14e9139c866f81ae54b8c40143b68dc5bdbdaebbcdb2c5278e28c0798 WatchSource:0}: Error finding container 59f2b4a14e9139c866f81ae54b8c40143b68dc5bdbdaebbcdb2c5278e28c0798: Status 404 returned error can't find the container with id 59f2b4a14e9139c866f81ae54b8c40143b68dc5bdbdaebbcdb2c5278e28c0798
Oct 09 19:43:38 crc kubenswrapper[4907]: I1009 19:43:38.982764 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5"]
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.002724 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-p5c55"]
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.013058 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5"]
Oct 09 19:43:39 crc kubenswrapper[4907]: W1009 19:43:39.029765 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e5bb04_8b14_4518_846b_721c24bc2348.slice/crio-0e2fb422cd8b5ca8ee781acfa1be7313bf8d0cadd2d391c9dfb103e066019f8d WatchSource:0}: Error finding container 0e2fb422cd8b5ca8ee781acfa1be7313bf8d0cadd2d391c9dfb103e066019f8d: Status 404 returned error can't find the container with id 0e2fb422cd8b5ca8ee781acfa1be7313bf8d0cadd2d391c9dfb103e066019f8d
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.041666 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj"]
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.045818 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh"]
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.049978 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng"]
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.053847 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7"]
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.068497 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t"]
Oct 09 19:43:39 crc kubenswrapper[4907]: E1009 19:43:39.074306 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t5ktg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-869cc7797f-v87t7_openstack-operators(64d8141d-db49-44dd-90bc-20b75a642c99): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 09 19:43:39 crc kubenswrapper[4907]: E1009 19:43:39.074316 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xfcln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-585fc5b659-zwj6t_openstack-operators(5870b9a9-c7a2-4e57-b917-e5a41c20dc55): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 09 19:43:39 crc kubenswrapper[4907]: E1009 19:43:39.074444 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbwf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-ffcdd6c94-9g4cj_openstack-operators(1313e0f0-b372-43cd-8f32-7c6bd566ab1a): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.074820 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5" event={"ID":"9ca2b641-57af-45b8-b0aa-3b45b08d13a7","Type":"ContainerStarted","Data":"9fed9e7cbd994d243368c03bf63e23eb56e30348abecb45c7cff414eba3afd62"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.077274 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" event={"ID":"9497c9e0-df89-48ae-be07-7df3e532bb35","Type":"ContainerStarted","Data":"a10947264a63850877560ec945908326e97d5ee087e503a2d35b4d0b3a8082a9"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.081737 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" event={"ID":"1313e0f0-b372-43cd-8f32-7c6bd566ab1a","Type":"ContainerStarted","Data":"d31c67228f1be7addc97878060520f2042f3eb91e7841f588e6a03ed855017f5"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.083624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" event={"ID":"e51f1489-6999-474e-9ae4-5f8598e608d0","Type":"ContainerStarted","Data":"44a3e3a38d4172ea45ec0de5e0c1fd8215d92ee7ac1e04748754fee9fd62d6ef"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.084533 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" event={"ID":"5681fed3-74e3-4e40-beff-cebbe06023e4","Type":"ContainerStarted","Data":"4e82992859b5e72bbc0013c33a688f54f86bd212fbb604790e7ed3d5298f57b7"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.085914 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" event={"ID":"71138822-c6a1-4657-a640-9350e6e6965c","Type":"ContainerStarted","Data":"ea999199b7820c7100c5a9fc338876ccda75cfc0679f255a83de8393443ad0a4"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.087191 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" event={"ID":"10b627f1-74be-41c8-a7e7-367beb0a828d","Type":"ContainerStarted","Data":"adeb8b36f321be336d7376e7bfb803454f605082aabc733a3d5d295a9055d548"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.089775 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" event={"ID":"e5a81d4d-968e-43b0-b53a-e5c475773a29","Type":"ContainerStarted","Data":"d9acb934c3b297a37f9895fde1606bcace0dc246b9d410189cf64d62c96fa855"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.091896 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" event={"ID":"759e961c-957a-436b-80cd-14294fce30ad","Type":"ContainerStarted","Data":"59f2b4a14e9139c866f81ae54b8c40143b68dc5bdbdaebbcdb2c5278e28c0798"}
Oct 09 19:43:39 crc kubenswrapper[4907]: E1009 19:43:39.092229 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vzz9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-646675d848-p5c55_openstack-operators(5681fed3-74e3-4e40-beff-cebbe06023e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.092896 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" event={"ID":"64d8141d-db49-44dd-90bc-20b75a642c99","Type":"ContainerStarted","Data":"8a89a75407b53264ea0e46ea9584d381199a160aeb178f9e35718b104ad963e7"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.093705 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" event={"ID":"364dc10d-b5b4-4c0e-a480-7dc371fc6a0d","Type":"ContainerStarted","Data":"23e896c0a3f5f0c18065956b95c9ce3612fdab61338c96f773d811be97074b8b"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.097379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" event={"ID":"94e5bb04-8b14-4518-846b-721c24bc2348","Type":"ContainerStarted","Data":"0e2fb422cd8b5ca8ee781acfa1be7313bf8d0cadd2d391c9dfb103e066019f8d"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.098375 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" event={"ID":"ddffdb06-43eb-44db-9afa-a56e2c6b467c","Type":"ContainerStarted","Data":"58f6197272cac1912e36032c37129816910c049b084bced77d1c1e41247373f5"}
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.174402 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"]
Oct 09 19:43:39 crc kubenswrapper[4907]: I1009 19:43:39.180850 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6"]
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.042922 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" podUID="5870b9a9-c7a2-4e57-b917-e5a41c20dc55"
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.044075 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" podUID="5681fed3-74e3-4e40-beff-cebbe06023e4"
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.045298 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" podUID="1313e0f0-b372-43cd-8f32-7c6bd566ab1a"
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.049063 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" podUID="64d8141d-db49-44dd-90bc-20b75a642c99"
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.123557 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" event={"ID":"385f8c3a-5a7a-4214-a8cf-9c6886264ea9","Type":"ContainerStarted","Data":"0a9279f46792487abe6b6cd0322eb180b92a7aea887236d45aea318baad6244d"}
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.123593 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" event={"ID":"385f8c3a-5a7a-4214-a8cf-9c6886264ea9","Type":"ContainerStarted","Data":"7b477cae367626366ec004dc7353da94709a64b053ca11c8f265707e1f2d8aad"}
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.125034 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" event={"ID":"1313e0f0-b372-43cd-8f32-7c6bd566ab1a","Type":"ContainerStarted","Data":"f139fa00be110983b054fd790bc74fe0b3ba7678ec33ef7f26917a3df621bacb"}
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.126831 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" podUID="1313e0f0-b372-43cd-8f32-7c6bd566ab1a"
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.127703 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" event={"ID":"523bbf58-dcf0-49f5-a198-24878c574c70","Type":"ContainerStarted","Data":"9e0e71271b4a6d73371960f233d2f3c35201db157f574f6d830939316cebbc13"}
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.135426 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" event={"ID":"5681fed3-74e3-4e40-beff-cebbe06023e4","Type":"ContainerStarted","Data":"9260e44d6d06511bf7956508217a25a5882da5f441b4cbe277759c1e3bb67337"}
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.137967 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" podUID="5681fed3-74e3-4e40-beff-cebbe06023e4"
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.152423 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" event={"ID":"7150c799-4a61-4c14-9471-99fbc61a8f7b","Type":"ContainerStarted","Data":"6f9e65e1142c1bbb3a37d9dcafecda25781fa7d2ecdaa2ec9b14236080a9dfa7"}
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.154422 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" event={"ID":"5870b9a9-c7a2-4e57-b917-e5a41c20dc55","Type":"ContainerStarted","Data":"d03dd7efb3a843b259a7352d759ea313f7109f965603efa2aafe1e6d7d9c002a"}
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.154458 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" event={"ID":"5870b9a9-c7a2-4e57-b917-e5a41c20dc55","Type":"ContainerStarted","Data":"866b83f4b7d4873c5d8febe0473d806f085b0177ca9c9fc419bc323b0594ca67"}
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.155623 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" podUID="5870b9a9-c7a2-4e57-b917-e5a41c20dc55"
Oct 09 19:43:40 crc kubenswrapper[4907]: I1009 19:43:40.156443 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" event={"ID":"64d8141d-db49-44dd-90bc-20b75a642c99","Type":"ContainerStarted","Data":"f7b7d1c90a303ed704d2e56a9c8c6a48d844c863b7340660ccdf141b8231c37c"}
Oct 09 19:43:40 crc kubenswrapper[4907]: E1009 19:43:40.157174 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" podUID="64d8141d-db49-44dd-90bc-20b75a642c99"
Oct 09 19:43:41 crc kubenswrapper[4907]: I1009 19:43:41.170070 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" event={"ID":"385f8c3a-5a7a-4214-a8cf-9c6886264ea9","Type":"ContainerStarted","Data":"8048010fbca8ce097bc11b64d1a3341e27bfbe7910520064bbe6fdc1379035d4"}
Oct 09 19:43:41 crc kubenswrapper[4907]: I1009 19:43:41.170183 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"
Oct 09 19:43:41 crc kubenswrapper[4907]: E1009 19:43:41.172799 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" podUID="5870b9a9-c7a2-4e57-b917-e5a41c20dc55"
Oct 09 19:43:41 crc kubenswrapper[4907]: E1009 19:43:41.172975 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" podUID="64d8141d-db49-44dd-90bc-20b75a642c99"
Oct 09 19:43:41 crc kubenswrapper[4907]: E1009 19:43:41.173220 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" podUID="1313e0f0-b372-43cd-8f32-7c6bd566ab1a"
Oct 09 19:43:41 crc kubenswrapper[4907]: E1009 19:43:41.174785 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" podUID="5681fed3-74e3-4e40-beff-cebbe06023e4"
Oct 09 19:43:41 crc kubenswrapper[4907]: I1009 19:43:41.221809 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg" podStartSLOduration=4.221766972 podStartE2EDuration="4.221766972s" podCreationTimestamp="2025-10-09 19:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:43:41.20963443 +0000 UTC m=+906.741601929" watchObservedRunningTime="2025-10-09 19:43:41.221766972 +0000 UTC m=+906.753734461"
Oct 09 19:43:48 crc kubenswrapper[4907]: I1009 19:43:48.256647 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6d899f9cc7-5mhzg"
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.297321 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" event={"ID":"396b2bde-8328-4285-81a3-58d361096cf8","Type":"ContainerStarted","Data":"0984f5fc8486106eed811c53110dcfe2ac54137d5dc11cb9f8841e1ade59962e"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.297909 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" event={"ID":"396b2bde-8328-4285-81a3-58d361096cf8","Type":"ContainerStarted","Data":"36e40eaf8d92bb8fcf7fe602272d00c7c05a19c147afcd8ab2c7350d36fbacc8"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.297937 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr"
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.299848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" event={"ID":"1353b956-2119-4690-be09-9f9b788737a5","Type":"ContainerStarted","Data":"def8b2c6bf8611a3c30e573288cc97f334817b1bce65cdac984bf1a7f1b2d1af"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.304621 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" event={"ID":"ddffdb06-43eb-44db-9afa-a56e2c6b467c","Type":"ContainerStarted","Data":"84f03c01cc63d06c72060bf898f2ac8c5c28dcfaec6275d4bc53a821ab38d60a"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.311261 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5" event={"ID":"9ca2b641-57af-45b8-b0aa-3b45b08d13a7","Type":"ContainerStarted","Data":"f1d8c523eee91656d4cd51c90e2513aa881e76d31ecfb15c5256c0f9f5f72616"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.321437 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" podStartSLOduration=2.796530712 podStartE2EDuration="16.321412055s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:37.500184935 +0000 UTC m=+903.032152424" lastFinishedPulling="2025-10-09 19:43:51.025066278 +0000 UTC m=+916.557033767" observedRunningTime="2025-10-09 19:43:52.318071369 +0000 UTC m=+917.850038868" watchObservedRunningTime="2025-10-09 19:43:52.321412055 +0000 UTC m=+917.853379544"
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.327011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" event={"ID":"759e961c-957a-436b-80cd-14294fce30ad","Type":"ContainerStarted","Data":"3ec3feb5330a45ba32141435ad0a72696412728730edb9c2e0eda23c4f2f2702"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.336170 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" event={"ID":"9497c9e0-df89-48ae-be07-7df3e532bb35","Type":"ContainerStarted","Data":"a4e9ba2f1f93f95a9cf763d93c550a9c7a480ffda3779dfe58288c9f02627668"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.336226 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" event={"ID":"9497c9e0-df89-48ae-be07-7df3e532bb35","Type":"ContainerStarted","Data":"bdec41a8a953f30006b98b22393e26fa32f5916264d667cc8a1b248f7c25f411"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.336427 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz"
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.341853 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" event={"ID":"b1701060-cf14-4dfc-9545-5b63be29728a","Type":"ContainerStarted","Data":"c0fbc6f82c8e43c5574dd5eda0de0fdca94f0f9f688af56f7f9c59d3c5335339"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.356754 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" event={"ID":"94e5bb04-8b14-4518-846b-721c24bc2348","Type":"ContainerStarted","Data":"54f228a2dbac014ea091f0986d879f3071c5d6e8191946db9072c6b127306d95"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.363121 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" event={"ID":"364dc10d-b5b4-4c0e-a480-7dc371fc6a0d","Type":"ContainerStarted","Data":"1ae27448be76d41430fa8eac5020bfd44698d53fc550442f6c39853197aea285"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.367115 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" event={"ID":"8c17b476-94f3-4391-a755-e816a5ed56e0","Type":"ContainerStarted","Data":"cc273c3fe78bd5d370f529919263d6086e004b86b57d125bf18e2fe3f932cf58"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.368577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" event={"ID":"7150c799-4a61-4c14-9471-99fbc61a8f7b","Type":"ContainerStarted","Data":"676e87bdba2fbfc65464e46c37647900e3ad9702ec9b66ec06d07d1f82546a59"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.372033 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" event={"ID":"cdc3d576-f3e6-4016-8856-ff8e5e6cf299","Type":"ContainerStarted","Data":"15fac3acb9ae53406337241130b8b55ba557c3f30e592482416e057b8ca91a5d"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.380025 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" event={"ID":"aa1daa5a-4e9e-4378-81ad-0dab2895f34a","Type":"ContainerStarted","Data":"03840b3735d1af1300b49eee3fc3d0cd82a8130a7b46ed0ba93444a821bd3b69"}
Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.380020 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5" podStartSLOduration=3.348976738 podStartE2EDuration="15.380009134s" podCreationTimestamp="2025-10-09 19:43:37 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.997324406 +0000 UTC m=+904.529291895" lastFinishedPulling="2025-10-09 
19:43:51.028356812 +0000 UTC m=+916.560324291" observedRunningTime="2025-10-09 19:43:52.349825257 +0000 UTC m=+917.881792756" watchObservedRunningTime="2025-10-09 19:43:52.380009134 +0000 UTC m=+917.911976623" Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.393420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" event={"ID":"71138822-c6a1-4657-a640-9350e6e6965c","Type":"ContainerStarted","Data":"f3b47ab52b5388ac95f29389e5ba7d5fcb2339fa6026962d0603bae596777304"} Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.419016 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" event={"ID":"10b627f1-74be-41c8-a7e7-367beb0a828d","Type":"ContainerStarted","Data":"5f70c5e0253ce8f6688f391adb0c46e70fb92aa44023491f0ba43663983a0fc8"} Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.419989 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.423863 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" event={"ID":"e5a81d4d-968e-43b0-b53a-e5c475773a29","Type":"ContainerStarted","Data":"fa5b3945a3eccbdd8a942225b317f374a8d8c4f21e73315106982797d11a0673"} Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.427630 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" event={"ID":"e51f1489-6999-474e-9ae4-5f8598e608d0","Type":"ContainerStarted","Data":"39c166353c4b2d1a961371eaced606771db7f11473e6e40a3a3933c063ad3cf0"} Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.434939 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" event={"ID":"523bbf58-dcf0-49f5-a198-24878c574c70","Type":"ContainerStarted","Data":"f556ddb2230311b6cd4d1121d03fe88202e3da4c3e7491334f70246325b334b6"} Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.438362 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" podStartSLOduration=4.043708353 podStartE2EDuration="16.438347537s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.631098321 +0000 UTC m=+904.163065810" lastFinishedPulling="2025-10-09 19:43:51.025737505 +0000 UTC m=+916.557704994" observedRunningTime="2025-10-09 19:43:52.437196048 +0000 UTC m=+917.969163547" watchObservedRunningTime="2025-10-09 19:43:52.438347537 +0000 UTC m=+917.970315026" Oct 09 19:43:52 crc kubenswrapper[4907]: I1009 19:43:52.441870 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" podStartSLOduration=4.153395467 podStartE2EDuration="16.441844257s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.630665769 +0000 UTC m=+904.162633258" lastFinishedPulling="2025-10-09 19:43:50.919114559 +0000 UTC m=+916.451082048" observedRunningTime="2025-10-09 19:43:52.380345973 +0000 UTC m=+917.912313462" watchObservedRunningTime="2025-10-09 19:43:52.441844257 +0000 UTC m=+917.973811746" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.444436 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" event={"ID":"10b627f1-74be-41c8-a7e7-367beb0a828d","Type":"ContainerStarted","Data":"4bb0c2ef417e1ebd91262222592788ead9bfc0ea93cd108399a2b195c6210a91"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.446420 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" event={"ID":"e5a81d4d-968e-43b0-b53a-e5c475773a29","Type":"ContainerStarted","Data":"ca6a89c873d3a6ff1ef69b3979e6c8c6aa22ffc0eaaf09b1f7a27f24e0741b82"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.447057 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.448601 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" event={"ID":"e51f1489-6999-474e-9ae4-5f8598e608d0","Type":"ContainerStarted","Data":"879a357020f792f094610d64d797d80275a2c0ccfb8fbc172fb73b9933422856"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.448791 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.450485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" event={"ID":"759e961c-957a-436b-80cd-14294fce30ad","Type":"ContainerStarted","Data":"39ce5bc37a83d778f06f5975c50d944da23eea8748584cdc21455b2af887411d"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.452418 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" event={"ID":"94e5bb04-8b14-4518-846b-721c24bc2348","Type":"ContainerStarted","Data":"f0256cbc50c3f9de24759df70a9102dcd9e71fda8710d39d3e58bf288844e124"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.452521 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.454341 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" event={"ID":"ddffdb06-43eb-44db-9afa-a56e2c6b467c","Type":"ContainerStarted","Data":"6e442f5841e798ef1dba9c3e5e8f8e0bd0329dc820801a724e1616d4507c634d"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.454395 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.456687 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" event={"ID":"364dc10d-b5b4-4c0e-a480-7dc371fc6a0d","Type":"ContainerStarted","Data":"3da23afb625e84254fbdeabc715c122ed7c6b554b01f373e09d7d3ebdf9fc0ec"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.457052 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.458880 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" event={"ID":"b1701060-cf14-4dfc-9545-5b63be29728a","Type":"ContainerStarted","Data":"15e067a6471905e081904d86cfe2a1856090f1aa8b69e9c5d14ac8b12617d689"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.459028 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.460429 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" event={"ID":"7150c799-4a61-4c14-9471-99fbc61a8f7b","Type":"ContainerStarted","Data":"e99e4629aba51e31a60f29bafb8080e30039b1215e04cd4426e0d7955fb83acd"} Oct 09 19:43:53 crc kubenswrapper[4907]: 
I1009 19:43:53.460562 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.462281 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" event={"ID":"aa1daa5a-4e9e-4378-81ad-0dab2895f34a","Type":"ContainerStarted","Data":"7da2d0bd74a55fbd2dfdc906aebb78350c1f467a12ba98dc5508dd1058aba95f"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.462394 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.463856 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" event={"ID":"71138822-c6a1-4657-a640-9350e6e6965c","Type":"ContainerStarted","Data":"d0a422617dc3ed514649595d864a529b40c04df4a3462f6ce8dcf629db537703"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.463972 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.465914 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" event={"ID":"cdc3d576-f3e6-4016-8856-ff8e5e6cf299","Type":"ContainerStarted","Data":"2d8720b862b75d598170a4946099defdfacab5941eb2b2d373c09cb1aabfdd76"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.466030 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.472392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" event={"ID":"1353b956-2119-4690-be09-9f9b788737a5","Type":"ContainerStarted","Data":"eb990cd6380475253bbf94dc98059e2e05c3f02c72ab37de287b6ecefe5d75a9"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.472518 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.477654 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" event={"ID":"523bbf58-dcf0-49f5-a198-24878c574c70","Type":"ContainerStarted","Data":"bb57858d793de100499fa9bcee4c6513f100b1a6b94f635f2a49fb2b943587ec"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.478246 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.479008 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" podStartSLOduration=4.775723371 podStartE2EDuration="17.478987567s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.324675766 +0000 UTC m=+903.856643255" lastFinishedPulling="2025-10-09 19:43:51.027939962 +0000 UTC m=+916.559907451" observedRunningTime="2025-10-09 19:43:53.467053029 +0000 UTC m=+918.999020528" watchObservedRunningTime="2025-10-09 19:43:53.478987567 +0000 UTC m=+919.010955076" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.480771 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" 
event={"ID":"8c17b476-94f3-4391-a755-e816a5ed56e0","Type":"ContainerStarted","Data":"ab6a7f475e19542a76977907c8fc0d5d356ca3304aeb7b65b9f63a64dd8ddcb1"} Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.495927 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" podStartSLOduration=5.256772165 podStartE2EDuration="17.495908453s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.790069336 +0000 UTC m=+904.322036825" lastFinishedPulling="2025-10-09 19:43:51.029205624 +0000 UTC m=+916.561173113" observedRunningTime="2025-10-09 19:43:53.485428383 +0000 UTC m=+919.017395892" watchObservedRunningTime="2025-10-09 19:43:53.495908453 +0000 UTC m=+919.027875942" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.511950 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" podStartSLOduration=5.261833176 podStartE2EDuration="17.511931656s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.776886837 +0000 UTC m=+904.308854326" lastFinishedPulling="2025-10-09 19:43:51.026985317 +0000 UTC m=+916.558952806" observedRunningTime="2025-10-09 19:43:53.510227712 +0000 UTC m=+919.042195231" watchObservedRunningTime="2025-10-09 19:43:53.511931656 +0000 UTC m=+919.043899145" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.529917 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" podStartSLOduration=5.21240245 podStartE2EDuration="17.529895008s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.631316486 +0000 UTC m=+904.163283975" lastFinishedPulling="2025-10-09 19:43:50.948809044 +0000 UTC m=+916.480776533" observedRunningTime="2025-10-09 
19:43:53.529785416 +0000 UTC m=+919.061752915" watchObservedRunningTime="2025-10-09 19:43:53.529895008 +0000 UTC m=+919.061862497" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.546619 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" podStartSLOduration=5.538623916 podStartE2EDuration="17.546605979s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.022842383 +0000 UTC m=+904.554809872" lastFinishedPulling="2025-10-09 19:43:51.030824446 +0000 UTC m=+916.562791935" observedRunningTime="2025-10-09 19:43:53.544285489 +0000 UTC m=+919.076252988" watchObservedRunningTime="2025-10-09 19:43:53.546605979 +0000 UTC m=+919.078573468" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.560660 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" podStartSLOduration=4.2462131 podStartE2EDuration="17.560640651s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:37.707619909 +0000 UTC m=+903.239587388" lastFinishedPulling="2025-10-09 19:43:51.02204745 +0000 UTC m=+916.554014939" observedRunningTime="2025-10-09 19:43:53.55712694 +0000 UTC m=+919.089094439" watchObservedRunningTime="2025-10-09 19:43:53.560640651 +0000 UTC m=+919.092608150" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.576400 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" podStartSLOduration=4.322614108 podStartE2EDuration="17.576376506s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:37.776621237 +0000 UTC m=+903.308588726" lastFinishedPulling="2025-10-09 19:43:51.030383635 +0000 UTC m=+916.562351124" observedRunningTime="2025-10-09 19:43:53.573073301 +0000 UTC 
m=+919.105040800" watchObservedRunningTime="2025-10-09 19:43:53.576376506 +0000 UTC m=+919.108344005" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.610056 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" podStartSLOduration=5.628156602 podStartE2EDuration="17.610040663s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.048110464 +0000 UTC m=+904.580077953" lastFinishedPulling="2025-10-09 19:43:51.029994525 +0000 UTC m=+916.561962014" observedRunningTime="2025-10-09 19:43:53.608590926 +0000 UTC m=+919.140558435" watchObservedRunningTime="2025-10-09 19:43:53.610040663 +0000 UTC m=+919.142008152" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.611425 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" podStartSLOduration=4.917374442 podStartE2EDuration="17.611418589s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:38.31859722 +0000 UTC m=+903.850564699" lastFinishedPulling="2025-10-09 19:43:51.012641357 +0000 UTC m=+916.544608846" observedRunningTime="2025-10-09 19:43:53.588873608 +0000 UTC m=+919.120841127" watchObservedRunningTime="2025-10-09 19:43:53.611418589 +0000 UTC m=+919.143386078" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.625259 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" podStartSLOduration=4.15651057 podStartE2EDuration="17.625239215s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:37.558662682 +0000 UTC m=+903.090630171" lastFinishedPulling="2025-10-09 19:43:51.027391337 +0000 UTC m=+916.559358816" observedRunningTime="2025-10-09 19:43:53.621122789 +0000 UTC m=+919.153090298" 
watchObservedRunningTime="2025-10-09 19:43:53.625239215 +0000 UTC m=+919.157206714" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.641813 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" podStartSLOduration=5.6590882 podStartE2EDuration="17.641790801s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.073872248 +0000 UTC m=+904.605839737" lastFinishedPulling="2025-10-09 19:43:51.056574859 +0000 UTC m=+916.588542338" observedRunningTime="2025-10-09 19:43:53.637655825 +0000 UTC m=+919.169623314" watchObservedRunningTime="2025-10-09 19:43:53.641790801 +0000 UTC m=+919.173758290" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.661357 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" podStartSLOduration=5.811810654 podStartE2EDuration="17.661344305s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.1768345 +0000 UTC m=+904.708801989" lastFinishedPulling="2025-10-09 19:43:51.026368151 +0000 UTC m=+916.558335640" observedRunningTime="2025-10-09 19:43:53.65765717 +0000 UTC m=+919.189624669" watchObservedRunningTime="2025-10-09 19:43:53.661344305 +0000 UTC m=+919.193311794" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.674952 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" podStartSLOduration=4.613234305 podStartE2EDuration="17.674930875s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:37.967917715 +0000 UTC m=+903.499885204" lastFinishedPulling="2025-10-09 19:43:51.029614285 +0000 UTC m=+916.561581774" observedRunningTime="2025-10-09 19:43:53.671550628 +0000 UTC m=+919.203518127" 
watchObservedRunningTime="2025-10-09 19:43:53.674930875 +0000 UTC m=+919.206898364" Oct 09 19:43:53 crc kubenswrapper[4907]: I1009 19:43:53.690667 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" podStartSLOduration=4.410447461 podStartE2EDuration="17.69064791s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:37.63892383 +0000 UTC m=+903.170891319" lastFinishedPulling="2025-10-09 19:43:50.919124289 +0000 UTC m=+916.451091768" observedRunningTime="2025-10-09 19:43:53.688028122 +0000 UTC m=+919.219995621" watchObservedRunningTime="2025-10-09 19:43:53.69064791 +0000 UTC m=+919.222615399" Oct 09 19:43:54 crc kubenswrapper[4907]: I1009 19:43:54.487747 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" Oct 09 19:43:54 crc kubenswrapper[4907]: I1009 19:43:54.487785 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" Oct 09 19:43:55 crc kubenswrapper[4907]: I1009 19:43:55.499958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" event={"ID":"5870b9a9-c7a2-4e57-b917-e5a41c20dc55","Type":"ContainerStarted","Data":"1058273fc81a8983a47565898d568c7c6c5c3ad71cd18a424292adf38c030297"} Oct 09 19:43:55 crc kubenswrapper[4907]: I1009 19:43:55.500527 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:43:55 crc kubenswrapper[4907]: I1009 19:43:55.502415 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" 
event={"ID":"64d8141d-db49-44dd-90bc-20b75a642c99","Type":"ContainerStarted","Data":"364135450b740f07b04be555f977ae99a369aa8949541cca50b19e3f3467b2f2"} Oct 09 19:43:55 crc kubenswrapper[4907]: I1009 19:43:55.530013 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" podStartSLOduration=3.554959572 podStartE2EDuration="19.529990935s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.074112584 +0000 UTC m=+904.606080063" lastFinishedPulling="2025-10-09 19:43:55.049143897 +0000 UTC m=+920.581111426" observedRunningTime="2025-10-09 19:43:55.525596732 +0000 UTC m=+921.057564241" watchObservedRunningTime="2025-10-09 19:43:55.529990935 +0000 UTC m=+921.061958424" Oct 09 19:43:56 crc kubenswrapper[4907]: I1009 19:43:56.709040 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-jzjwr" Oct 09 19:43:56 crc kubenswrapper[4907]: I1009 19:43:56.736887 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" podStartSLOduration=4.758531559 podStartE2EDuration="20.736855677s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.074178916 +0000 UTC m=+904.606146405" lastFinishedPulling="2025-10-09 19:43:55.052503034 +0000 UTC m=+920.584470523" observedRunningTime="2025-10-09 19:43:55.549120448 +0000 UTC m=+921.081087947" watchObservedRunningTime="2025-10-09 19:43:56.736855677 +0000 UTC m=+922.268823206" Oct 09 19:43:56 crc kubenswrapper[4907]: I1009 19:43:56.759300 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-5pcmk" Oct 09 19:43:56 crc kubenswrapper[4907]: I1009 19:43:56.828610 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-5d6rr" Oct 09 19:43:56 crc kubenswrapper[4907]: I1009 19:43:56.865843 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-tpg5p" Oct 09 19:43:56 crc kubenswrapper[4907]: I1009 19:43:56.953831 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-d2zsg" Oct 09 19:43:56 crc kubenswrapper[4907]: I1009 19:43:56.986210 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-595rv" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.088366 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-khm2s" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.105200 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hjhwf" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.116520 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-jnbdz" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.146750 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-q2flj" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.215696 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-t46vt" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.225989 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-4bsn7" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.235218 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-z5klh" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.347606 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.406953 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-gbxbh" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.422114 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-8ztd5" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.451273 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-895c94468-xtfng" Oct 09 19:43:57 crc kubenswrapper[4907]: I1009 19:43:57.849066 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6" Oct 09 19:43:59 crc kubenswrapper[4907]: I1009 19:43:59.535098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" event={"ID":"1313e0f0-b372-43cd-8f32-7c6bd566ab1a","Type":"ContainerStarted","Data":"d71b1dcf9883648c20f57455810a1600f1062ae9db28becdec07e7eca9a00c2c"} Oct 09 19:43:59 crc kubenswrapper[4907]: I1009 19:43:59.536955 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" Oct 09 19:43:59 crc kubenswrapper[4907]: I1009 19:43:59.539009 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" event={"ID":"5681fed3-74e3-4e40-beff-cebbe06023e4","Type":"ContainerStarted","Data":"482c6d91e73d2f246d81bb4df199621d549484ec65411385a067376b87e76ead"} Oct 09 19:43:59 crc kubenswrapper[4907]: I1009 19:43:59.539819 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" Oct 09 19:43:59 crc kubenswrapper[4907]: I1009 19:43:59.559969 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" podStartSLOduration=3.540687544 podStartE2EDuration="23.559948136s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.07435624 +0000 UTC m=+904.606323729" lastFinishedPulling="2025-10-09 19:43:59.093616792 +0000 UTC m=+924.625584321" observedRunningTime="2025-10-09 19:43:59.557174345 +0000 UTC m=+925.089141924" watchObservedRunningTime="2025-10-09 19:43:59.559948136 +0000 UTC m=+925.091915645" Oct 09 19:43:59 crc kubenswrapper[4907]: I1009 19:43:59.578121 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" podStartSLOduration=3.541344941 podStartE2EDuration="23.578102804s" podCreationTimestamp="2025-10-09 19:43:36 +0000 UTC" firstStartedPulling="2025-10-09 19:43:39.091985604 +0000 UTC m=+904.623953103" lastFinishedPulling="2025-10-09 19:43:59.128743427 +0000 UTC m=+924.660710966" observedRunningTime="2025-10-09 19:43:59.576258197 +0000 UTC m=+925.108225716" watchObservedRunningTime="2025-10-09 19:43:59.578102804 +0000 UTC m=+925.110070293" Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.299170 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.300665 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.300781 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.301898 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e52c7a0fe32a558feb0415aa3280260b781b94dc2a29de298da06be1d8aa2d54"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.302031 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://e52c7a0fe32a558feb0415aa3280260b781b94dc2a29de298da06be1d8aa2d54" gracePeriod=600 Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.604532 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="e52c7a0fe32a558feb0415aa3280260b781b94dc2a29de298da06be1d8aa2d54" exitCode=0 Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.604595 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"e52c7a0fe32a558feb0415aa3280260b781b94dc2a29de298da06be1d8aa2d54"} Oct 09 19:44:06 crc kubenswrapper[4907]: I1009 19:44:06.604649 4907 scope.go:117] "RemoveContainer" containerID="9652a7dfb693b946f43ed7007125b8bc1aa6768f8074819278bd9dc415f2d69d" Oct 09 19:44:07 crc kubenswrapper[4907]: I1009 19:44:07.338183 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-zwj6t" Oct 09 19:44:07 crc kubenswrapper[4907]: I1009 19:44:07.350663 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-v87t7" Oct 09 19:44:07 crc kubenswrapper[4907]: I1009 19:44:07.484965 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-9g4cj" Oct 09 19:44:07 crc kubenswrapper[4907]: I1009 19:44:07.587369 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-646675d848-p5c55" Oct 09 19:44:11 crc kubenswrapper[4907]: I1009 19:44:11.647892 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"9461f8fa6da50e0e37d8d2c88aee594214386d7c074bf0b7db5d5d79f7d078a8"} Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.449812 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8fwdq"] Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.451594 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.453627 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4cb8p" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.454163 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.455007 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.455215 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.477092 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8fwdq"] Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.510222 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rhwmc"] Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.511985 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.514095 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.528895 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rhwmc"] Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.585960 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhlvn\" (UniqueName: \"kubernetes.io/projected/e8a61e71-59e4-4060-90c7-5b0b634b8160-kube-api-access-bhlvn\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.586019 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-config\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.586098 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.586140 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae8ffa5-4baf-4717-a82a-4357b24aac3c-config\") pod \"dnsmasq-dns-675f4bcbfc-8fwdq\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 
crc kubenswrapper[4907]: I1009 19:44:26.586197 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkg5g\" (UniqueName: \"kubernetes.io/projected/cae8ffa5-4baf-4717-a82a-4357b24aac3c-kube-api-access-rkg5g\") pod \"dnsmasq-dns-675f4bcbfc-8fwdq\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.687644 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkg5g\" (UniqueName: \"kubernetes.io/projected/cae8ffa5-4baf-4717-a82a-4357b24aac3c-kube-api-access-rkg5g\") pod \"dnsmasq-dns-675f4bcbfc-8fwdq\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.687716 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhlvn\" (UniqueName: \"kubernetes.io/projected/e8a61e71-59e4-4060-90c7-5b0b634b8160-kube-api-access-bhlvn\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.687752 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-config\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.687790 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: 
I1009 19:44:26.687814 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae8ffa5-4baf-4717-a82a-4357b24aac3c-config\") pod \"dnsmasq-dns-675f4bcbfc-8fwdq\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.688745 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae8ffa5-4baf-4717-a82a-4357b24aac3c-config\") pod \"dnsmasq-dns-675f4bcbfc-8fwdq\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.688759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.689382 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-config\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.706519 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkg5g\" (UniqueName: \"kubernetes.io/projected/cae8ffa5-4baf-4717-a82a-4357b24aac3c-kube-api-access-rkg5g\") pod \"dnsmasq-dns-675f4bcbfc-8fwdq\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.709593 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhlvn\" (UniqueName: 
\"kubernetes.io/projected/e8a61e71-59e4-4060-90c7-5b0b634b8160-kube-api-access-bhlvn\") pod \"dnsmasq-dns-78dd6ddcc-rhwmc\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.772131 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:26 crc kubenswrapper[4907]: I1009 19:44:26.831820 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:27 crc kubenswrapper[4907]: I1009 19:44:27.213337 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8fwdq"] Oct 09 19:44:27 crc kubenswrapper[4907]: I1009 19:44:27.301081 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rhwmc"] Oct 09 19:44:27 crc kubenswrapper[4907]: I1009 19:44:27.813359 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" event={"ID":"e8a61e71-59e4-4060-90c7-5b0b634b8160","Type":"ContainerStarted","Data":"897d769315fdd4f829d308472841536dea4c86930a6d0a884f6cf2d060860a73"} Oct 09 19:44:27 crc kubenswrapper[4907]: I1009 19:44:27.815385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" event={"ID":"cae8ffa5-4baf-4717-a82a-4357b24aac3c","Type":"ContainerStarted","Data":"04b889ca8aca502d3a1809f556efbd1ff0a9a9314e4b2612592b437bc757332a"} Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.400700 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8fwdq"] Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.420176 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9q9xw"] Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.421326 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.443724 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9q9xw"] Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.526114 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.526248 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-config\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.526271 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8tld\" (UniqueName: \"kubernetes.io/projected/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-kube-api-access-w8tld\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.627656 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.627761 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-config\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.627787 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8tld\" (UniqueName: \"kubernetes.io/projected/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-kube-api-access-w8tld\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.628748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.628769 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-config\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.657787 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8tld\" (UniqueName: \"kubernetes.io/projected/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-kube-api-access-w8tld\") pod \"dnsmasq-dns-666b6646f7-9q9xw\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.673050 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rhwmc"] Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.711785 4907 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j7zr6"] Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.713774 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.726917 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j7zr6"] Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.752967 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.833516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qssz\" (UniqueName: \"kubernetes.io/projected/eb8b4400-e16b-45de-b19a-0d312e4f1e51-kube-api-access-9qssz\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.833583 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-config\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.833646 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.936448 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qssz\" (UniqueName: 
\"kubernetes.io/projected/eb8b4400-e16b-45de-b19a-0d312e4f1e51-kube-api-access-9qssz\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.937187 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-config\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.937296 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.938493 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-config\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.938591 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:29 crc kubenswrapper[4907]: I1009 19:44:29.962937 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qssz\" (UniqueName: \"kubernetes.io/projected/eb8b4400-e16b-45de-b19a-0d312e4f1e51-kube-api-access-9qssz\") pod \"dnsmasq-dns-57d769cc4f-j7zr6\" 
(UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.041549 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.241671 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9q9xw"] Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.557034 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.559183 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.562864 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.563178 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5v9sj" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.563354 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.563500 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.563522 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.563547 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.563906 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 
19:44:30.572179 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.579913 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j7zr6"] Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649373 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649403 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649434 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649571 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649604 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dn4c\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-kube-api-access-8dn4c\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649633 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649657 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649717 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.649761 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751052 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751097 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751146 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751190 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc 
kubenswrapper[4907]: I1009 19:44:30.751226 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751246 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751302 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751325 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751341 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dn4c\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-kube-api-access-8dn4c\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751361 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.751377 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.752061 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.752203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.752628 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.752646 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.753092 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.753130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.755412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.755826 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.756485 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.758274 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.770573 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dn4c\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-kube-api-access-8dn4c\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.776029 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.845904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" event={"ID":"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68","Type":"ContainerStarted","Data":"3232b619bf623012af9af8902b4081e0616c3fc1bb73dfef61a2056c8ddcbc7c"} Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.848491 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.850726 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.854407 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.854756 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.855651 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.855827 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.856015 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hprcd" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.855952 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.858010 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.866479 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.884401 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsr5n\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-kube-api-access-jsr5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953393 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05cb258e-fa1a-4978-b143-d6c817ec0f96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953563 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05cb258e-fa1a-4978-b143-d6c817ec0f96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953592 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953638 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953656 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953698 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:30 crc kubenswrapper[4907]: I1009 19:44:30.953719 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055134 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05cb258e-fa1a-4978-b143-d6c817ec0f96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055244 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05cb258e-fa1a-4978-b143-d6c817ec0f96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055285 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055304 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055319 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055353 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055392 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsr5n\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-kube-api-access-jsr5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055407 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055435 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.055946 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.056969 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.057873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.058833 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.059843 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.060657 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.064072 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.069357 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05cb258e-fa1a-4978-b143-d6c817ec0f96-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.069957 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05cb258e-fa1a-4978-b143-d6c817ec0f96-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc 
kubenswrapper[4907]: I1009 19:44:31.070121 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.081023 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsr5n\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-kube-api-access-jsr5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.089743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:31 crc kubenswrapper[4907]: I1009 19:44:31.173574 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.452609 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.455397 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.459130 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.460106 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.461034 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.463609 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.464165 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8jflb" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.468161 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.477433 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490179 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-secrets\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490245 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9tpc\" (UniqueName: \"kubernetes.io/projected/892437ea-977d-434a-ba03-2ce726fb21b0-kube-api-access-x9tpc\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 
19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490397 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490420 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-kolla-config\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/892437ea-977d-434a-ba03-2ce726fb21b0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-config-data-default\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: 
I1009 19:44:33.490590 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.490631 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.591282 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/892437ea-977d-434a-ba03-2ce726fb21b0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.592896 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-config-data-default\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.594355 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.596320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.596802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-secrets\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.597698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9tpc\" (UniqueName: \"kubernetes.io/projected/892437ea-977d-434a-ba03-2ce726fb21b0-kube-api-access-x9tpc\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.597850 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.598010 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.598160 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.591926 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/892437ea-977d-434a-ba03-2ce726fb21b0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.594258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-config-data-default\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.596738 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.596235 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.599017 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/892437ea-977d-434a-ba03-2ce726fb21b0-kolla-config\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.602428 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-secrets\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.604425 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.618942 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892437ea-977d-434a-ba03-2ce726fb21b0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.619890 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9tpc\" (UniqueName: \"kubernetes.io/projected/892437ea-977d-434a-ba03-2ce726fb21b0-kube-api-access-x9tpc\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.625556 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"892437ea-977d-434a-ba03-2ce726fb21b0\") " pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: W1009 19:44:33.769143 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb8b4400_e16b_45de_b19a_0d312e4f1e51.slice/crio-c6f5d8934e48af883764ed2268d945208e2234964672f2008ddbe0503eaf14cb WatchSource:0}: Error finding container c6f5d8934e48af883764ed2268d945208e2234964672f2008ddbe0503eaf14cb: Status 404 returned error can't find the container with id c6f5d8934e48af883764ed2268d945208e2234964672f2008ddbe0503eaf14cb Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.787124 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.788236 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.789664 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.794924 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.801066 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vskc8" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.801405 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.801663 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.802010 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.871808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" 
event={"ID":"eb8b4400-e16b-45de-b19a-0d312e4f1e51","Type":"ContainerStarted","Data":"c6f5d8934e48af883764ed2268d945208e2234964672f2008ddbe0503eaf14cb"} Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.904336 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.904730 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.904869 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92026239-8122-4224-ae55-be69f2c42a77-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.905041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.905154 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.905271 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.905391 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2ct\" (UniqueName: \"kubernetes.io/projected/92026239-8122-4224-ae55-be69f2c42a77-kube-api-access-6m2ct\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.905631 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:33 crc kubenswrapper[4907]: I1009 19:44:33.905736 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.007684 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/92026239-8122-4224-ae55-be69f2c42a77-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.007802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.007831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.008222 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92026239-8122-4224-ae55-be69f2c42a77-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.008570 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.008601 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-config-data-default\") 
pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.008645 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2ct\" (UniqueName: \"kubernetes.io/projected/92026239-8122-4224-ae55-be69f2c42a77-kube-api-access-6m2ct\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.008724 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.008751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.009006 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.009015 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.009272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.009316 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.009586 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92026239-8122-4224-ae55-be69f2c42a77-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.013763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.014045 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.030428 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6m2ct\" (UniqueName: \"kubernetes.io/projected/92026239-8122-4224-ae55-be69f2c42a77-kube-api-access-6m2ct\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.045230 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92026239-8122-4224-ae55-be69f2c42a77-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.049239 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"92026239-8122-4224-ae55-be69f2c42a77\") " pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.125811 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.133811 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.135349 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.136158 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.136451 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vtlkt" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.143824 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.145429 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.312107 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51120f59-aafb-4105-b919-fe8e4fc20f93-config-data\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.312400 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqvb\" (UniqueName: \"kubernetes.io/projected/51120f59-aafb-4105-b919-fe8e4fc20f93-kube-api-access-jzqvb\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.312417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51120f59-aafb-4105-b919-fe8e4fc20f93-kolla-config\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.312534 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51120f59-aafb-4105-b919-fe8e4fc20f93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.312555 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51120f59-aafb-4105-b919-fe8e4fc20f93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.414053 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51120f59-aafb-4105-b919-fe8e4fc20f93-config-data\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.414107 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqvb\" (UniqueName: \"kubernetes.io/projected/51120f59-aafb-4105-b919-fe8e4fc20f93-kube-api-access-jzqvb\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.414129 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51120f59-aafb-4105-b919-fe8e4fc20f93-kolla-config\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.414221 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51120f59-aafb-4105-b919-fe8e4fc20f93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.414240 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51120f59-aafb-4105-b919-fe8e4fc20f93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc 
kubenswrapper[4907]: I1009 19:44:34.415018 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51120f59-aafb-4105-b919-fe8e4fc20f93-kolla-config\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.415154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51120f59-aafb-4105-b919-fe8e4fc20f93-config-data\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.419818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/51120f59-aafb-4105-b919-fe8e4fc20f93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.420590 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51120f59-aafb-4105-b919-fe8e4fc20f93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.432955 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqvb\" (UniqueName: \"kubernetes.io/projected/51120f59-aafb-4105-b919-fe8e4fc20f93-kube-api-access-jzqvb\") pod \"memcached-0\" (UID: \"51120f59-aafb-4105-b919-fe8e4fc20f93\") " pod="openstack/memcached-0" Oct 09 19:44:34 crc kubenswrapper[4907]: I1009 19:44:34.464486 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 09 19:44:35 crc kubenswrapper[4907]: I1009 19:44:35.661430 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:44:35 crc kubenswrapper[4907]: I1009 19:44:35.662711 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 19:44:35 crc kubenswrapper[4907]: I1009 19:44:35.664887 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-t772w" Oct 09 19:44:35 crc kubenswrapper[4907]: I1009 19:44:35.673917 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:44:35 crc kubenswrapper[4907]: I1009 19:44:35.839053 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmtg\" (UniqueName: \"kubernetes.io/projected/553f4f9e-5f34-4731-898c-4f0cacf4b545-kube-api-access-kpmtg\") pod \"kube-state-metrics-0\" (UID: \"553f4f9e-5f34-4731-898c-4f0cacf4b545\") " pod="openstack/kube-state-metrics-0" Oct 09 19:44:35 crc kubenswrapper[4907]: I1009 19:44:35.941140 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmtg\" (UniqueName: \"kubernetes.io/projected/553f4f9e-5f34-4731-898c-4f0cacf4b545-kube-api-access-kpmtg\") pod \"kube-state-metrics-0\" (UID: \"553f4f9e-5f34-4731-898c-4f0cacf4b545\") " pod="openstack/kube-state-metrics-0" Oct 09 19:44:35 crc kubenswrapper[4907]: I1009 19:44:35.963329 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmtg\" (UniqueName: \"kubernetes.io/projected/553f4f9e-5f34-4731-898c-4f0cacf4b545-kube-api-access-kpmtg\") pod \"kube-state-metrics-0\" (UID: \"553f4f9e-5f34-4731-898c-4f0cacf4b545\") " pod="openstack/kube-state-metrics-0" Oct 09 19:44:36 crc kubenswrapper[4907]: I1009 19:44:36.028069 4907 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.800111 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dz7f2"] Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.801686 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.804943 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-k5sv2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.805043 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.806616 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.810454 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2"] Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.862907 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9z259"] Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.864419 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.875622 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9z259"] Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.908758 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjqx4\" (UniqueName: \"kubernetes.io/projected/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-kube-api-access-zjqx4\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.908823 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-scripts\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.908851 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-combined-ca-bundle\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.908873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-run\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.908897 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-run-ovn\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.908928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-ovn-controller-tls-certs\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:39 crc kubenswrapper[4907]: I1009 19:44:39.908959 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-log-ovn\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010367 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-run\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010423 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-run\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010460 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-run-ovn\") pod 
\"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010490 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-scripts\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-ovn-controller-tls-certs\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010659 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-lib\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010684 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-log-ovn\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010716 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mqdx\" (UniqueName: \"kubernetes.io/projected/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-kube-api-access-4mqdx\") pod \"ovn-controller-ovs-9z259\" (UID: 
\"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010753 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-etc-ovs\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010790 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-log\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.010818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjqx4\" (UniqueName: \"kubernetes.io/projected/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-kube-api-access-zjqx4\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.011013 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-scripts\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.011074 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-combined-ca-bundle\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 
crc kubenswrapper[4907]: I1009 19:44:40.011686 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-log-ovn\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.011763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-run-ovn\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.012134 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-var-run\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.013026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-scripts\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.016616 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-ovn-controller-tls-certs\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.026479 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-combined-ca-bundle\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.027492 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjqx4\" (UniqueName: \"kubernetes.io/projected/c9bf7943-cd49-4a26-83e2-9efc4c9dcc02-kube-api-access-zjqx4\") pod \"ovn-controller-dz7f2\" (UID: \"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02\") " pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-lib\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112334 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mqdx\" (UniqueName: \"kubernetes.io/projected/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-kube-api-access-4mqdx\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112368 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-etc-ovs\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112395 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-log\") pod \"ovn-controller-ovs-9z259\" (UID: 
\"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112446 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-run\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112476 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-scripts\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112575 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-lib\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112686 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-run\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.112705 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-var-log\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.113213 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-etc-ovs\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.114639 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-scripts\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.124144 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.127854 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mqdx\" (UniqueName: \"kubernetes.io/projected/c0f67e81-b9c9-419e-bc68-dcc44ac15f4d-kube-api-access-4mqdx\") pod \"ovn-controller-ovs-9z259\" (UID: \"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d\") " pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.178992 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.268723 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.270336 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.272146 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.272173 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.272190 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gpxsv" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.272744 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.272855 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.289000 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417200 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c18daae-d993-4ac8-954d-f8c38cacedd1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v84sn\" (UniqueName: \"kubernetes.io/projected/9c18daae-d993-4ac8-954d-f8c38cacedd1-kube-api-access-v84sn\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c18daae-d993-4ac8-954d-f8c38cacedd1-config\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417630 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417684 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c18daae-d993-4ac8-954d-f8c38cacedd1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417815 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417890 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.417995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524437 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c18daae-d993-4ac8-954d-f8c38cacedd1-config\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524601 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c18daae-d993-4ac8-954d-f8c38cacedd1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524667 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 
19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524722 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c18daae-d993-4ac8-954d-f8c38cacedd1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.524751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v84sn\" (UniqueName: \"kubernetes.io/projected/9c18daae-d993-4ac8-954d-f8c38cacedd1-kube-api-access-v84sn\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.525374 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c18daae-d993-4ac8-954d-f8c38cacedd1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.525834 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.525954 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9c18daae-d993-4ac8-954d-f8c38cacedd1-config\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.526574 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c18daae-d993-4ac8-954d-f8c38cacedd1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.529511 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.537040 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.549718 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v84sn\" (UniqueName: \"kubernetes.io/projected/9c18daae-d993-4ac8-954d-f8c38cacedd1-kube-api-access-v84sn\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.553839 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c18daae-d993-4ac8-954d-f8c38cacedd1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " 
pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.565482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9c18daae-d993-4ac8-954d-f8c38cacedd1\") " pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:40 crc kubenswrapper[4907]: I1009 19:44:40.604079 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 19:44:41 crc kubenswrapper[4907]: E1009 19:44:41.945503 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 19:44:41 crc kubenswrapper[4907]: E1009 19:44:41.945971 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rkg5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-8fwdq_openstack(cae8ffa5-4baf-4717-a82a-4357b24aac3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 19:44:41 crc kubenswrapper[4907]: E1009 19:44:41.947147 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" podUID="cae8ffa5-4baf-4717-a82a-4357b24aac3c" Oct 09 19:44:42 crc kubenswrapper[4907]: E1009 19:44:42.034401 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 19:44:42 crc kubenswrapper[4907]: E1009 19:44:42.034589 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhlvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-rhwmc_openstack(e8a61e71-59e4-4060-90c7-5b0b634b8160): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 19:44:42 crc kubenswrapper[4907]: E1009 19:44:42.036103 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" podUID="e8a61e71-59e4-4060-90c7-5b0b634b8160" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.613150 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.621103 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.624363 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.624902 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6z9zb" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.625725 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.627360 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.627524 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.764469 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6r4s\" (UniqueName: \"kubernetes.io/projected/f1723ff9-c43f-463d-903c-11f9b38519e2-kube-api-access-f6r4s\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.764546 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.764581 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1723ff9-c43f-463d-903c-11f9b38519e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.764804 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.764992 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.765044 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1723ff9-c43f-463d-903c-11f9b38519e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.765193 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.765314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1723ff9-c43f-463d-903c-11f9b38519e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc 
kubenswrapper[4907]: I1009 19:44:42.866368 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.866451 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.866534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1723ff9-c43f-463d-903c-11f9b38519e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.866698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.866766 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.867887 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f1723ff9-c43f-463d-903c-11f9b38519e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.867968 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1723ff9-c43f-463d-903c-11f9b38519e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.867996 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6r4s\" (UniqueName: \"kubernetes.io/projected/f1723ff9-c43f-463d-903c-11f9b38519e2-kube-api-access-f6r4s\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.868788 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.868818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1723ff9-c43f-463d-903c-11f9b38519e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.868744 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1723ff9-c43f-463d-903c-11f9b38519e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 
crc kubenswrapper[4907]: I1009 19:44:42.869305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1723ff9-c43f-463d-903c-11f9b38519e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.875113 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.878276 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.895885 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.905123 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6r4s\" (UniqueName: \"kubernetes.io/projected/f1723ff9-c43f-463d-903c-11f9b38519e2-kube-api-access-f6r4s\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.911222 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f1723ff9-c43f-463d-903c-11f9b38519e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f1723ff9-c43f-463d-903c-11f9b38519e2\") " pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.912431 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.918762 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.927384 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.938027 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.947275 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.949746 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.953350 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.966286 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2"] Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.980133 4907 generic.go:334] "Generic (PLEG): container finished" podID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerID="1df610f463c6b9dcc91032330ad82f39fbfebe41308924ad4d453bb98f7cc8c6" exitCode=0 Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.980197 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" event={"ID":"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68","Type":"ContainerDied","Data":"1df610f463c6b9dcc91032330ad82f39fbfebe41308924ad4d453bb98f7cc8c6"} Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.981979 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05cb258e-fa1a-4978-b143-d6c817ec0f96","Type":"ContainerStarted","Data":"8db72d064294617ec75103545fc4ec9b3280e2046c3c55dd8578944fd51d188b"} Oct 09 19:44:42 crc kubenswrapper[4907]: I1009 19:44:42.983068 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92026239-8122-4224-ae55-be69f2c42a77","Type":"ContainerStarted","Data":"169f52999a1fb4f7a701caf3bb5eb4c8d0cc2b7036da66981b4a65c16de872e1"} Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.003529 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"553f4f9e-5f34-4731-898c-4f0cacf4b545","Type":"ContainerStarted","Data":"eabef965e504f2b7d6b8888ac724316d9fcede217a9dc364cfaad9ef5b4d3183"} Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.012332 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerID="46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604" exitCode=0 Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.012438 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" event={"ID":"eb8b4400-e16b-45de-b19a-0d312e4f1e51","Type":"ContainerDied","Data":"46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604"} Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.015748 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c","Type":"ContainerStarted","Data":"0f0d1b9d3a7ac5fff3ca4abbd0efc31641719d943ff2330c2266ca66855ae989"} Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.067630 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.450139 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.514009 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.585634 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkg5g\" (UniqueName: \"kubernetes.io/projected/cae8ffa5-4baf-4717-a82a-4357b24aac3c-kube-api-access-rkg5g\") pod \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.585961 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae8ffa5-4baf-4717-a82a-4357b24aac3c-config\") pod \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\" (UID: \"cae8ffa5-4baf-4717-a82a-4357b24aac3c\") " Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.586764 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae8ffa5-4baf-4717-a82a-4357b24aac3c-config" (OuterVolumeSpecName: "config") pod "cae8ffa5-4baf-4717-a82a-4357b24aac3c" (UID: "cae8ffa5-4baf-4717-a82a-4357b24aac3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.591755 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae8ffa5-4baf-4717-a82a-4357b24aac3c-kube-api-access-rkg5g" (OuterVolumeSpecName: "kube-api-access-rkg5g") pod "cae8ffa5-4baf-4717-a82a-4357b24aac3c" (UID: "cae8ffa5-4baf-4717-a82a-4357b24aac3c"). InnerVolumeSpecName "kube-api-access-rkg5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.653033 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 19:44:43 crc kubenswrapper[4907]: W1009 19:44:43.658227 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1723ff9_c43f_463d_903c_11f9b38519e2.slice/crio-9931139d7ff7c773e7a4ae50760f29d3968efe24fbfdc8abe3cd1630929e3c32 WatchSource:0}: Error finding container 9931139d7ff7c773e7a4ae50760f29d3968efe24fbfdc8abe3cd1630929e3c32: Status 404 returned error can't find the container with id 9931139d7ff7c773e7a4ae50760f29d3968efe24fbfdc8abe3cd1630929e3c32 Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.687276 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhlvn\" (UniqueName: \"kubernetes.io/projected/e8a61e71-59e4-4060-90c7-5b0b634b8160-kube-api-access-bhlvn\") pod \"e8a61e71-59e4-4060-90c7-5b0b634b8160\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.687340 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-dns-svc\") pod \"e8a61e71-59e4-4060-90c7-5b0b634b8160\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.687400 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-config\") pod \"e8a61e71-59e4-4060-90c7-5b0b634b8160\" (UID: \"e8a61e71-59e4-4060-90c7-5b0b634b8160\") " Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.687675 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cae8ffa5-4baf-4717-a82a-4357b24aac3c-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.687692 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkg5g\" (UniqueName: \"kubernetes.io/projected/cae8ffa5-4baf-4717-a82a-4357b24aac3c-kube-api-access-rkg5g\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.688036 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8a61e71-59e4-4060-90c7-5b0b634b8160" (UID: "e8a61e71-59e4-4060-90c7-5b0b634b8160"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.688226 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-config" (OuterVolumeSpecName: "config") pod "e8a61e71-59e4-4060-90c7-5b0b634b8160" (UID: "e8a61e71-59e4-4060-90c7-5b0b634b8160"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.691643 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a61e71-59e4-4060-90c7-5b0b634b8160-kube-api-access-bhlvn" (OuterVolumeSpecName: "kube-api-access-bhlvn") pod "e8a61e71-59e4-4060-90c7-5b0b634b8160" (UID: "e8a61e71-59e4-4060-90c7-5b0b634b8160"). InnerVolumeSpecName "kube-api-access-bhlvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.789315 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhlvn\" (UniqueName: \"kubernetes.io/projected/e8a61e71-59e4-4060-90c7-5b0b634b8160-kube-api-access-bhlvn\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.789352 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.789363 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a61e71-59e4-4060-90c7-5b0b634b8160-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:43 crc kubenswrapper[4907]: I1009 19:44:43.887332 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9z259"] Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.025710 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.025709 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rhwmc" event={"ID":"e8a61e71-59e4-4060-90c7-5b0b634b8160","Type":"ContainerDied","Data":"897d769315fdd4f829d308472841536dea4c86930a6d0a884f6cf2d060860a73"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.027347 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" event={"ID":"cae8ffa5-4baf-4717-a82a-4357b24aac3c","Type":"ContainerDied","Data":"04b889ca8aca502d3a1809f556efbd1ff0a9a9314e4b2612592b437bc757332a"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.027504 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8fwdq" Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.030682 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" event={"ID":"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68","Type":"ContainerStarted","Data":"ac856c785e8f7185dfe4817bca19706e4d4b9e7c6beddd5d88369a6af018e287"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.030793 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.045274 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2" event={"ID":"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02","Type":"ContainerStarted","Data":"f69f868793897f42de0aae8a228203c1f5409d96f7cea1b40ec8370509b3f8a1"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.052661 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" event={"ID":"eb8b4400-e16b-45de-b19a-0d312e4f1e51","Type":"ContainerStarted","Data":"2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.052780 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.054076 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f1723ff9-c43f-463d-903c-11f9b38519e2","Type":"ContainerStarted","Data":"9931139d7ff7c773e7a4ae50760f29d3968efe24fbfdc8abe3cd1630929e3c32"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.055878 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51120f59-aafb-4105-b919-fe8e4fc20f93","Type":"ContainerStarted","Data":"a02b015054edea7b95db0ef095ac8af7c0984d7f0b5a0bbd67d3b984da7bf598"} Oct 09 19:44:44 crc 
kubenswrapper[4907]: I1009 19:44:44.057690 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"892437ea-977d-434a-ba03-2ce726fb21b0","Type":"ContainerStarted","Data":"54b999dff7deae434d4d478fcfe9e76d24976dc12a367a040af3c8838ac9fb92"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.059012 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9c18daae-d993-4ac8-954d-f8c38cacedd1","Type":"ContainerStarted","Data":"4c6298f34db7e8fc578f18da39310b1706faced9671f899a44fd711e3eb94757"} Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.061916 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" podStartSLOduration=3.190602352 podStartE2EDuration="15.061902484s" podCreationTimestamp="2025-10-09 19:44:29 +0000 UTC" firstStartedPulling="2025-10-09 19:44:30.257999133 +0000 UTC m=+955.789966622" lastFinishedPulling="2025-10-09 19:44:42.129299265 +0000 UTC m=+967.661266754" observedRunningTime="2025-10-09 19:44:44.061263868 +0000 UTC m=+969.593231357" watchObservedRunningTime="2025-10-09 19:44:44.061902484 +0000 UTC m=+969.593869973" Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.097000 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8fwdq"] Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.102538 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8fwdq"] Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.108826 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" podStartSLOduration=6.747374884 podStartE2EDuration="15.108805933s" podCreationTimestamp="2025-10-09 19:44:29 +0000 UTC" firstStartedPulling="2025-10-09 19:44:33.773583063 +0000 UTC m=+959.305550592" lastFinishedPulling="2025-10-09 19:44:42.135014152 +0000 UTC m=+967.666981641" 
observedRunningTime="2025-10-09 19:44:44.101857154 +0000 UTC m=+969.633824653" watchObservedRunningTime="2025-10-09 19:44:44.108805933 +0000 UTC m=+969.640773432" Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.151057 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rhwmc"] Oct 09 19:44:44 crc kubenswrapper[4907]: I1009 19:44:44.156089 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rhwmc"] Oct 09 19:44:44 crc kubenswrapper[4907]: W1009 19:44:44.284056 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0f67e81_b9c9_419e_bc68_dcc44ac15f4d.slice/crio-f9df0a5648253e40ead498e3daee6a9a3d930844eb7f16f0d71623c2e40e20e9 WatchSource:0}: Error finding container f9df0a5648253e40ead498e3daee6a9a3d930844eb7f16f0d71623c2e40e20e9: Status 404 returned error can't find the container with id f9df0a5648253e40ead498e3daee6a9a3d930844eb7f16f0d71623c2e40e20e9 Oct 09 19:44:45 crc kubenswrapper[4907]: I1009 19:44:45.069490 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9z259" event={"ID":"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d","Type":"ContainerStarted","Data":"f9df0a5648253e40ead498e3daee6a9a3d930844eb7f16f0d71623c2e40e20e9"} Oct 09 19:44:45 crc kubenswrapper[4907]: I1009 19:44:45.162134 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae8ffa5-4baf-4717-a82a-4357b24aac3c" path="/var/lib/kubelet/pods/cae8ffa5-4baf-4717-a82a-4357b24aac3c/volumes" Oct 09 19:44:45 crc kubenswrapper[4907]: I1009 19:44:45.162521 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a61e71-59e4-4060-90c7-5b0b634b8160" path="/var/lib/kubelet/pods/e8a61e71-59e4-4060-90c7-5b0b634b8160/volumes" Oct 09 19:44:49 crc kubenswrapper[4907]: I1009 19:44:49.755669 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:50 crc kubenswrapper[4907]: I1009 19:44:50.042666 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:44:50 crc kubenswrapper[4907]: I1009 19:44:50.117665 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9q9xw"] Oct 09 19:44:50 crc kubenswrapper[4907]: I1009 19:44:50.117905 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" podUID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerName="dnsmasq-dns" containerID="cri-o://ac856c785e8f7185dfe4817bca19706e4d4b9e7c6beddd5d88369a6af018e287" gracePeriod=10 Oct 09 19:44:51 crc kubenswrapper[4907]: I1009 19:44:51.123679 4907 generic.go:334] "Generic (PLEG): container finished" podID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerID="ac856c785e8f7185dfe4817bca19706e4d4b9e7c6beddd5d88369a6af018e287" exitCode=0 Oct 09 19:44:51 crc kubenswrapper[4907]: I1009 19:44:51.124011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" event={"ID":"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68","Type":"ContainerDied","Data":"ac856c785e8f7185dfe4817bca19706e4d4b9e7c6beddd5d88369a6af018e287"} Oct 09 19:44:51 crc kubenswrapper[4907]: I1009 19:44:51.865799 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.050116 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-dns-svc\") pod \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.050394 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8tld\" (UniqueName: \"kubernetes.io/projected/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-kube-api-access-w8tld\") pod \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.050726 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-config\") pod \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\" (UID: \"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68\") " Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.095576 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-kube-api-access-w8tld" (OuterVolumeSpecName: "kube-api-access-w8tld") pod "6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" (UID: "6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68"). InnerVolumeSpecName "kube-api-access-w8tld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.137341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" event={"ID":"6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68","Type":"ContainerDied","Data":"3232b619bf623012af9af8902b4081e0616c3fc1bb73dfef61a2056c8ddcbc7c"} Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.137390 4907 scope.go:117] "RemoveContainer" containerID="ac856c785e8f7185dfe4817bca19706e4d4b9e7c6beddd5d88369a6af018e287" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.137531 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9q9xw" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.152232 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8tld\" (UniqueName: \"kubernetes.io/projected/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-kube-api-access-w8tld\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.153947 4907 scope.go:117] "RemoveContainer" containerID="1df610f463c6b9dcc91032330ad82f39fbfebe41308924ad4d453bb98f7cc8c6" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.574388 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-config" (OuterVolumeSpecName: "config") pod "6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" (UID: "6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.589663 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" (UID: "6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.660728 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.660777 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.788441 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9q9xw"] Oct 09 19:44:52 crc kubenswrapper[4907]: I1009 19:44:52.796610 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9q9xw"] Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.145154 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"553f4f9e-5f34-4731-898c-4f0cacf4b545","Type":"ContainerStarted","Data":"c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f"} Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.146189 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.147443 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2" event={"ID":"c9bf7943-cd49-4a26-83e2-9efc4c9dcc02","Type":"ContainerStarted","Data":"4d8639ffd97cf142d814e8197cee6337e9e4df1024e92cff4857e0c486f79fb7"} Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.147887 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dz7f2" Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.149000 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"9c18daae-d993-4ac8-954d-f8c38cacedd1","Type":"ContainerStarted","Data":"5484a93f642d40bc5874e848def12023a48196fd4193f1746a618f68248833d4"} Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.150247 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9z259" event={"ID":"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d","Type":"ContainerStarted","Data":"82b054d3db40358bc33d4f6584a95e0ad38361cfd1d4daeab8039c53826c68a1"} Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.162913 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" path="/var/lib/kubelet/pods/6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68/volumes" Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.164086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"51120f59-aafb-4105-b919-fe8e4fc20f93","Type":"ContainerStarted","Data":"a73bea4f6e56482e6741357ba1c080e07d39f55b1de29d0c7ff9b04b91c0f94d"} Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.164159 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.164181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"892437ea-977d-434a-ba03-2ce726fb21b0","Type":"ContainerStarted","Data":"2497c10528e96ef5d11ff35fe1c8c79f0d2f58642c50ec77e993c5339fe9e2f5"} Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.164201 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92026239-8122-4224-ae55-be69f2c42a77","Type":"ContainerStarted","Data":"1523fda67dc99b26de7f30a2d87f5a178b624f3ba6ceea21341e40feb7de5d62"} Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.168596 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.139131316 
podStartE2EDuration="18.168574142s" podCreationTimestamp="2025-10-09 19:44:35 +0000 UTC" firstStartedPulling="2025-10-09 19:44:42.928553446 +0000 UTC m=+968.460520935" lastFinishedPulling="2025-10-09 19:44:51.957996262 +0000 UTC m=+977.489963761" observedRunningTime="2025-10-09 19:44:53.163017774 +0000 UTC m=+978.694985283" watchObservedRunningTime="2025-10-09 19:44:53.168574142 +0000 UTC m=+978.700541641" Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.189703 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dz7f2" podStartSLOduration=5.792626126 podStartE2EDuration="14.18968378s" podCreationTimestamp="2025-10-09 19:44:39 +0000 UTC" firstStartedPulling="2025-10-09 19:44:43.014137411 +0000 UTC m=+968.546104900" lastFinishedPulling="2025-10-09 19:44:51.411195065 +0000 UTC m=+976.943162554" observedRunningTime="2025-10-09 19:44:53.180300453 +0000 UTC m=+978.712267952" watchObservedRunningTime="2025-10-09 19:44:53.18968378 +0000 UTC m=+978.721651269" Oct 09 19:44:53 crc kubenswrapper[4907]: I1009 19:44:53.245698 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.971851408 podStartE2EDuration="19.245678564s" podCreationTimestamp="2025-10-09 19:44:34 +0000 UTC" firstStartedPulling="2025-10-09 19:44:43.013973797 +0000 UTC m=+968.545941286" lastFinishedPulling="2025-10-09 19:44:51.287800943 +0000 UTC m=+976.819768442" observedRunningTime="2025-10-09 19:44:53.235094709 +0000 UTC m=+978.767062198" watchObservedRunningTime="2025-10-09 19:44:53.245678564 +0000 UTC m=+978.777646063" Oct 09 19:44:54 crc kubenswrapper[4907]: I1009 19:44:54.172725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c","Type":"ContainerStarted","Data":"6e458b11269c1cb13b676c476b97734eb508c7576d9ba6e4b28cb7cd48f34f70"} Oct 09 19:44:54 crc kubenswrapper[4907]: I1009 19:44:54.176235 
4907 generic.go:334] "Generic (PLEG): container finished" podID="c0f67e81-b9c9-419e-bc68-dcc44ac15f4d" containerID="82b054d3db40358bc33d4f6584a95e0ad38361cfd1d4daeab8039c53826c68a1" exitCode=0 Oct 09 19:44:54 crc kubenswrapper[4907]: I1009 19:44:54.176329 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9z259" event={"ID":"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d","Type":"ContainerDied","Data":"82b054d3db40358bc33d4f6584a95e0ad38361cfd1d4daeab8039c53826c68a1"} Oct 09 19:44:54 crc kubenswrapper[4907]: I1009 19:44:54.180209 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f1723ff9-c43f-463d-903c-11f9b38519e2","Type":"ContainerStarted","Data":"bb2101ad28f6df921e626854b397e55e59140bbf962e89e996183a72c8332733"} Oct 09 19:44:54 crc kubenswrapper[4907]: I1009 19:44:54.185231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05cb258e-fa1a-4978-b143-d6c817ec0f96","Type":"ContainerStarted","Data":"90127cac8301cedbb378bf90ca8583c35071b8c22a6f7cf9a0d8fbe7ef6cfe53"} Oct 09 19:44:55 crc kubenswrapper[4907]: I1009 19:44:55.211600 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9z259" event={"ID":"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d","Type":"ContainerStarted","Data":"9770a48c8fbcf5bc9e5c164c60cbb9762a5a391935051d049db3c8c391f22449"} Oct 09 19:44:56 crc kubenswrapper[4907]: I1009 19:44:56.223753 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9z259" event={"ID":"c0f67e81-b9c9-419e-bc68-dcc44ac15f4d","Type":"ContainerStarted","Data":"3b6cf4926f6b8dccce1c8f8f8e011eacfbdfdbb4c4e0ae34fddb54a290029bd6"} Oct 09 19:44:56 crc kubenswrapper[4907]: I1009 19:44:56.224237 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:56 crc kubenswrapper[4907]: I1009 19:44:56.224252 4907 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:44:56 crc kubenswrapper[4907]: I1009 19:44:56.248498 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9z259" podStartSLOduration=9.684307496 podStartE2EDuration="17.248481236s" podCreationTimestamp="2025-10-09 19:44:39 +0000 UTC" firstStartedPulling="2025-10-09 19:44:44.294368183 +0000 UTC m=+969.826335672" lastFinishedPulling="2025-10-09 19:44:51.858541923 +0000 UTC m=+977.390509412" observedRunningTime="2025-10-09 19:44:56.242232912 +0000 UTC m=+981.774200411" watchObservedRunningTime="2025-10-09 19:44:56.248481236 +0000 UTC m=+981.780448725" Oct 09 19:44:57 crc kubenswrapper[4907]: I1009 19:44:57.234329 4907 generic.go:334] "Generic (PLEG): container finished" podID="892437ea-977d-434a-ba03-2ce726fb21b0" containerID="2497c10528e96ef5d11ff35fe1c8c79f0d2f58642c50ec77e993c5339fe9e2f5" exitCode=0 Oct 09 19:44:57 crc kubenswrapper[4907]: I1009 19:44:57.234440 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"892437ea-977d-434a-ba03-2ce726fb21b0","Type":"ContainerDied","Data":"2497c10528e96ef5d11ff35fe1c8c79f0d2f58642c50ec77e993c5339fe9e2f5"} Oct 09 19:44:58 crc kubenswrapper[4907]: I1009 19:44:58.245046 4907 generic.go:334] "Generic (PLEG): container finished" podID="92026239-8122-4224-ae55-be69f2c42a77" containerID="1523fda67dc99b26de7f30a2d87f5a178b624f3ba6ceea21341e40feb7de5d62" exitCode=0 Oct 09 19:44:58 crc kubenswrapper[4907]: I1009 19:44:58.245094 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92026239-8122-4224-ae55-be69f2c42a77","Type":"ContainerDied","Data":"1523fda67dc99b26de7f30a2d87f5a178b624f3ba6ceea21341e40feb7de5d62"} Oct 09 19:44:59 crc kubenswrapper[4907]: I1009 19:44:59.466213 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 
09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.141944 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf"] Oct 09 19:45:00 crc kubenswrapper[4907]: E1009 19:45:00.142246 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerName="init" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.142262 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerName="init" Oct 09 19:45:00 crc kubenswrapper[4907]: E1009 19:45:00.142293 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerName="dnsmasq-dns" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.142299 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerName="dnsmasq-dns" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.142499 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa1d257-8d6f-43ff-bfa1-e40b2a12ee68" containerName="dnsmasq-dns" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.142992 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.144742 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.148334 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.159778 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf"] Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.181408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/072c0854-7d0a-4f81-8060-50e85811eeb0-secret-volume\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.181481 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5db\" (UniqueName: \"kubernetes.io/projected/072c0854-7d0a-4f81-8060-50e85811eeb0-kube-api-access-vj5db\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.181586 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/072c0854-7d0a-4f81-8060-50e85811eeb0-config-volume\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.282545 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/072c0854-7d0a-4f81-8060-50e85811eeb0-config-volume\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.282651 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/072c0854-7d0a-4f81-8060-50e85811eeb0-secret-volume\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.282690 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5db\" (UniqueName: \"kubernetes.io/projected/072c0854-7d0a-4f81-8060-50e85811eeb0-kube-api-access-vj5db\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.283737 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/072c0854-7d0a-4f81-8060-50e85811eeb0-config-volume\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.288051 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/072c0854-7d0a-4f81-8060-50e85811eeb0-secret-volume\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.305441 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5db\" (UniqueName: \"kubernetes.io/projected/072c0854-7d0a-4f81-8060-50e85811eeb0-kube-api-access-vj5db\") pod \"collect-profiles-29333985-7ktjf\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:00 crc kubenswrapper[4907]: I1009 19:45:00.462127 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.064176 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fhw7x"] Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.069493 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.080490 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.081668 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fhw7x"] Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.117883 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf"] Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.128186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.128233 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-ovs-rundir\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.128274 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-combined-ca-bundle\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.128327 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-ovn-rundir\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.128353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-config\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.128406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhnk\" (UniqueName: \"kubernetes.io/projected/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-kube-api-access-mrhnk\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.217933 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qgq4m"] Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.219245 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.226060 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.230272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.230318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-ovs-rundir\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.230355 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-combined-ca-bundle\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.230417 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-ovn-rundir\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.230456 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-config\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.230544 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhnk\" (UniqueName: \"kubernetes.io/projected/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-kube-api-access-mrhnk\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.231823 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-ovs-rundir\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.232306 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-ovn-rundir\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.233291 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-config\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.234545 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qgq4m"] Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.236305 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-combined-ca-bundle\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.237886 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.253913 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhnk\" (UniqueName: \"kubernetes.io/projected/ed4e1f67-4cb8-4d46-823b-fb81e37d63c1-kube-api-access-mrhnk\") pod \"ovn-controller-metrics-fhw7x\" (UID: \"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1\") " pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.288392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" event={"ID":"072c0854-7d0a-4f81-8060-50e85811eeb0","Type":"ContainerStarted","Data":"35d0fc1454085a01bdb9f8d54257b8f02adecc2ebd9fa1ca0f888521c146e2d8"} Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.290120 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f1723ff9-c43f-463d-903c-11f9b38519e2","Type":"ContainerStarted","Data":"5781e67d03548084094a2c48e7a6eb3337e941f539bcc6c0b29731099b016eb5"} Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.294764 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"892437ea-977d-434a-ba03-2ce726fb21b0","Type":"ContainerStarted","Data":"08f29cf1a7135b60d0e88ca3ba13ff1ee2c6ee773d5abbf1994f302bde43d38a"} Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.302451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"92026239-8122-4224-ae55-be69f2c42a77","Type":"ContainerStarted","Data":"16c40a3660f22f876054688238827754e60ab8aa41a47c2867480be9809300f7"} Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.315062 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9c18daae-d993-4ac8-954d-f8c38cacedd1","Type":"ContainerStarted","Data":"6931e86caeaea09f89fbac0a6d7bda72d079205d0b41a4ae8532551c759f6fbf"} Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.326936 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.261542575 podStartE2EDuration="22.326919497s" podCreationTimestamp="2025-10-09 19:44:41 +0000 UTC" firstStartedPulling="2025-10-09 19:44:43.660662007 +0000 UTC m=+969.192629496" lastFinishedPulling="2025-10-09 19:45:02.726038929 +0000 UTC m=+988.258006418" observedRunningTime="2025-10-09 19:45:03.325032084 +0000 UTC m=+988.856999573" watchObservedRunningTime="2025-10-09 19:45:03.326919497 +0000 UTC m=+988.858886986" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.364460 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.7571563260000005 podStartE2EDuration="24.364435784s" podCreationTimestamp="2025-10-09 19:44:39 +0000 UTC" firstStartedPulling="2025-10-09 19:44:43.094146032 +0000 UTC m=+968.626113521" lastFinishedPulling="2025-10-09 19:45:02.70142548 +0000 UTC m=+988.233392979" observedRunningTime="2025-10-09 19:45:03.34823636 +0000 UTC m=+988.880203859" watchObservedRunningTime="2025-10-09 19:45:03.364435784 +0000 UTC m=+988.896403273" Oct 09 
19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.402777 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fhw7x" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.416703 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.020573601 podStartE2EDuration="31.416685052s" podCreationTimestamp="2025-10-09 19:44:32 +0000 UTC" firstStartedPulling="2025-10-09 19:44:43.014555412 +0000 UTC m=+968.546522901" lastFinishedPulling="2025-10-09 19:44:51.410666863 +0000 UTC m=+976.942634352" observedRunningTime="2025-10-09 19:45:03.388870449 +0000 UTC m=+988.920837938" watchObservedRunningTime="2025-10-09 19:45:03.416685052 +0000 UTC m=+988.948652541" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.417554 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.396092523 podStartE2EDuration="31.417548002s" podCreationTimestamp="2025-10-09 19:44:32 +0000 UTC" firstStartedPulling="2025-10-09 19:44:42.910581953 +0000 UTC m=+968.442549442" lastFinishedPulling="2025-10-09 19:44:51.932037432 +0000 UTC m=+977.464004921" observedRunningTime="2025-10-09 19:45:03.413726504 +0000 UTC m=+988.945693993" watchObservedRunningTime="2025-10-09 19:45:03.417548002 +0000 UTC m=+988.949515491" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.435257 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.435433 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k4s9d\" (UniqueName: \"kubernetes.io/projected/013a6081-32a1-400a-8471-5d59718d543f-kube-api-access-k4s9d\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.435546 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-config\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.435645 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.504019 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qgq4m"] Oct 09 19:45:03 crc kubenswrapper[4907]: E1009 19:45:03.504866 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-k4s9d ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" podUID="013a6081-32a1-400a-8471-5d59718d543f" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.536847 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4s9d\" (UniqueName: \"kubernetes.io/projected/013a6081-32a1-400a-8471-5d59718d543f-kube-api-access-k4s9d\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc 
kubenswrapper[4907]: I1009 19:45:03.536912 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-config\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.536950 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.536975 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.537721 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.538676 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.538881 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-config\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.562517 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4s9d\" (UniqueName: \"kubernetes.io/projected/013a6081-32a1-400a-8471-5d59718d543f-kube-api-access-k4s9d\") pod \"dnsmasq-dns-7f896c8c65-qgq4m\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.563562 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gnmbc"] Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.564827 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.567017 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.588578 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gnmbc"] Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.643777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.643835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-config\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: 
\"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.643878 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.643905 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4bg\" (UniqueName: \"kubernetes.io/projected/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-kube-api-access-qp4bg\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.643924 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.753324 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.753382 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-config\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: 
\"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.753428 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.753455 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4bg\" (UniqueName: \"kubernetes.io/projected/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-kube-api-access-qp4bg\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.753489 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.754223 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.754309 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.754511 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.754766 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-config\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.784899 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4bg\" (UniqueName: \"kubernetes.io/projected/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-kube-api-access-qp4bg\") pod \"dnsmasq-dns-86db49b7ff-gnmbc\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.788654 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.788688 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.883193 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.951028 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 09 19:45:03 crc kubenswrapper[4907]: I1009 19:45:03.985384 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fhw7x"] Oct 09 19:45:03 crc kubenswrapper[4907]: W1009 19:45:03.995382 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4e1f67_4cb8_4d46_823b_fb81e37d63c1.slice/crio-05ce5d16baeec2964a9bed8398592c948747092adcaa7e64c08fe5c661f6aef4 WatchSource:0}: Error finding container 05ce5d16baeec2964a9bed8398592c948747092adcaa7e64c08fe5c661f6aef4: Status 404 returned error can't find the container with id 05ce5d16baeec2964a9bed8398592c948747092adcaa7e64c08fe5c661f6aef4 Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.003543 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.144440 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.144944 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.309921 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gnmbc"] Oct 09 19:45:04 crc kubenswrapper[4907]: W1009 19:45:04.318764 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eb85480_a1d3_47cc_ba0a_d5bdc877342d.slice/crio-0879cc71f941e5ab387ea619ace304de50be42612121ebb943da0469c4356bce WatchSource:0}: Error finding container 
0879cc71f941e5ab387ea619ace304de50be42612121ebb943da0469c4356bce: Status 404 returned error can't find the container with id 0879cc71f941e5ab387ea619ace304de50be42612121ebb943da0469c4356bce Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.324419 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fhw7x" event={"ID":"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1","Type":"ContainerStarted","Data":"36e6ee22c221463a934cd4c832beced88921cc9ec1999e81cc7d2639ca9be2f3"} Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.324487 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fhw7x" event={"ID":"ed4e1f67-4cb8-4d46-823b-fb81e37d63c1","Type":"ContainerStarted","Data":"05ce5d16baeec2964a9bed8398592c948747092adcaa7e64c08fe5c661f6aef4"} Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.325666 4907 generic.go:334] "Generic (PLEG): container finished" podID="072c0854-7d0a-4f81-8060-50e85811eeb0" containerID="83dea5c6e16336dcab69d4448d09f6308f45fea94c1620c2384a6edeb8ddc627" exitCode=0 Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.325771 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" event={"ID":"072c0854-7d0a-4f81-8060-50e85811eeb0","Type":"ContainerDied","Data":"83dea5c6e16336dcab69d4448d09f6308f45fea94c1620c2384a6edeb8ddc627"} Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.326076 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.326591 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.385761 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.404569 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fhw7x" podStartSLOduration=1.404546374 podStartE2EDuration="1.404546374s" podCreationTimestamp="2025-10-09 19:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:45:04.354963008 +0000 UTC m=+989.886930517" watchObservedRunningTime="2025-10-09 19:45:04.404546374 +0000 UTC m=+989.936513863" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.450295 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.562071 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-dns-svc\") pod \"013a6081-32a1-400a-8471-5d59718d543f\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.562138 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4s9d\" (UniqueName: \"kubernetes.io/projected/013a6081-32a1-400a-8471-5d59718d543f-kube-api-access-k4s9d\") pod \"013a6081-32a1-400a-8471-5d59718d543f\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.562236 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-ovsdbserver-sb\") pod \"013a6081-32a1-400a-8471-5d59718d543f\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.562302 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-config\") pod \"013a6081-32a1-400a-8471-5d59718d543f\" (UID: \"013a6081-32a1-400a-8471-5d59718d543f\") " Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.563055 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "013a6081-32a1-400a-8471-5d59718d543f" (UID: "013a6081-32a1-400a-8471-5d59718d543f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.563111 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "013a6081-32a1-400a-8471-5d59718d543f" (UID: "013a6081-32a1-400a-8471-5d59718d543f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.563498 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-config" (OuterVolumeSpecName: "config") pod "013a6081-32a1-400a-8471-5d59718d543f" (UID: "013a6081-32a1-400a-8471-5d59718d543f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.572599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013a6081-32a1-400a-8471-5d59718d543f-kube-api-access-k4s9d" (OuterVolumeSpecName: "kube-api-access-k4s9d") pod "013a6081-32a1-400a-8471-5d59718d543f" (UID: "013a6081-32a1-400a-8471-5d59718d543f"). InnerVolumeSpecName "kube-api-access-k4s9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.604772 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.655230 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.664550 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.664582 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4s9d\" (UniqueName: \"kubernetes.io/projected/013a6081-32a1-400a-8471-5d59718d543f-kube-api-access-k4s9d\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.664594 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:04 crc kubenswrapper[4907]: I1009 19:45:04.664604 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013a6081-32a1-400a-8471-5d59718d543f-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.334216 4907 generic.go:334] "Generic (PLEG): container finished" podID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerID="45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5" exitCode=0 Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.334459 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" 
event={"ID":"0eb85480-a1d3-47cc-ba0a-d5bdc877342d","Type":"ContainerDied","Data":"45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5"} Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.334607 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" event={"ID":"0eb85480-a1d3-47cc-ba0a-d5bdc877342d","Type":"ContainerStarted","Data":"0879cc71f941e5ab387ea619ace304de50be42612121ebb943da0469c4356bce"} Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.334870 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-qgq4m" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.335306 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.398009 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qgq4m"] Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.407425 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-qgq4m"] Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.411095 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.677069 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.682261 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.682415 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.684782 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.685101 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7cmfp" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.685266 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.693116 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.694776 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.694818 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhmtp\" (UniqueName: \"kubernetes.io/projected/97065ead-95b6-46d3-ab20-a073f6b5f243-kube-api-access-fhmtp\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.694874 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.694946 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.695009 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97065ead-95b6-46d3-ab20-a073f6b5f243-scripts\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.695036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97065ead-95b6-46d3-ab20-a073f6b5f243-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.695112 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97065ead-95b6-46d3-ab20-a073f6b5f243-config\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.723338 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.796913 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj5db\" (UniqueName: \"kubernetes.io/projected/072c0854-7d0a-4f81-8060-50e85811eeb0-kube-api-access-vj5db\") pod \"072c0854-7d0a-4f81-8060-50e85811eeb0\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.797326 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.797413 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhmtp\" (UniqueName: \"kubernetes.io/projected/97065ead-95b6-46d3-ab20-a073f6b5f243-kube-api-access-fhmtp\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.797513 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.797649 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 
19:45:05.797762 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97065ead-95b6-46d3-ab20-a073f6b5f243-scripts\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.797834 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97065ead-95b6-46d3-ab20-a073f6b5f243-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.797933 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97065ead-95b6-46d3-ab20-a073f6b5f243-config\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.798289 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97065ead-95b6-46d3-ab20-a073f6b5f243-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.798630 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97065ead-95b6-46d3-ab20-a073f6b5f243-scripts\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.798812 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97065ead-95b6-46d3-ab20-a073f6b5f243-config\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 
09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.801879 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072c0854-7d0a-4f81-8060-50e85811eeb0-kube-api-access-vj5db" (OuterVolumeSpecName: "kube-api-access-vj5db") pod "072c0854-7d0a-4f81-8060-50e85811eeb0" (UID: "072c0854-7d0a-4f81-8060-50e85811eeb0"). InnerVolumeSpecName "kube-api-access-vj5db". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.802644 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.802755 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.802957 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97065ead-95b6-46d3-ab20-a073f6b5f243-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.814287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhmtp\" (UniqueName: \"kubernetes.io/projected/97065ead-95b6-46d3-ab20-a073f6b5f243-kube-api-access-fhmtp\") pod \"ovn-northd-0\" (UID: \"97065ead-95b6-46d3-ab20-a073f6b5f243\") " pod="openstack/ovn-northd-0" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.898799 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/072c0854-7d0a-4f81-8060-50e85811eeb0-secret-volume\") pod \"072c0854-7d0a-4f81-8060-50e85811eeb0\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.899011 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/072c0854-7d0a-4f81-8060-50e85811eeb0-config-volume\") pod \"072c0854-7d0a-4f81-8060-50e85811eeb0\" (UID: \"072c0854-7d0a-4f81-8060-50e85811eeb0\") " Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.899546 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj5db\" (UniqueName: \"kubernetes.io/projected/072c0854-7d0a-4f81-8060-50e85811eeb0-kube-api-access-vj5db\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.899871 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072c0854-7d0a-4f81-8060-50e85811eeb0-config-volume" (OuterVolumeSpecName: "config-volume") pod "072c0854-7d0a-4f81-8060-50e85811eeb0" (UID: "072c0854-7d0a-4f81-8060-50e85811eeb0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:05 crc kubenswrapper[4907]: I1009 19:45:05.901768 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/072c0854-7d0a-4f81-8060-50e85811eeb0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "072c0854-7d0a-4f81-8060-50e85811eeb0" (UID: "072c0854-7d0a-4f81-8060-50e85811eeb0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.001623 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/072c0854-7d0a-4f81-8060-50e85811eeb0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.001665 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/072c0854-7d0a-4f81-8060-50e85811eeb0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.007611 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gnmbc"] Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.029136 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.089001 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc2mk"] Oct 09 19:45:06 crc kubenswrapper[4907]: E1009 19:45:06.091111 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072c0854-7d0a-4f81-8060-50e85811eeb0" containerName="collect-profiles" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.091140 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="072c0854-7d0a-4f81-8060-50e85811eeb0" containerName="collect-profiles" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.091370 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="072c0854-7d0a-4f81-8060-50e85811eeb0" containerName="collect-profiles" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.092521 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.117755 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc2mk"] Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.214511 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.214625 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlbwg\" (UniqueName: \"kubernetes.io/projected/c23c05c5-d7fc-4792-be12-5cfabce11bf7-kube-api-access-nlbwg\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.214693 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-config\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.214753 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-dns-svc\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.214774 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.250826 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.318898 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlbwg\" (UniqueName: \"kubernetes.io/projected/c23c05c5-d7fc-4792-be12-5cfabce11bf7-kube-api-access-nlbwg\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.318966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-config\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.319003 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-dns-svc\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.319017 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc 
kubenswrapper[4907]: I1009 19:45:06.319046 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.320763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.324031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-config\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.324701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-dns-svc\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.326817 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.356320 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nlbwg\" (UniqueName: \"kubernetes.io/projected/c23c05c5-d7fc-4792-be12-5cfabce11bf7-kube-api-access-nlbwg\") pod \"dnsmasq-dns-698758b865-gc2mk\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.358406 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" event={"ID":"072c0854-7d0a-4f81-8060-50e85811eeb0","Type":"ContainerDied","Data":"35d0fc1454085a01bdb9f8d54257b8f02adecc2ebd9fa1ca0f888521c146e2d8"} Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.358448 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35d0fc1454085a01bdb9f8d54257b8f02adecc2ebd9fa1ca0f888521c146e2d8" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.358520 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.369769 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" event={"ID":"0eb85480-a1d3-47cc-ba0a-d5bdc877342d","Type":"ContainerStarted","Data":"fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52"} Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.370518 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.397075 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" podStartSLOduration=3.397058716 podStartE2EDuration="3.397058716s" podCreationTimestamp="2025-10-09 19:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:45:06.392186173 +0000 UTC 
m=+991.924153662" watchObservedRunningTime="2025-10-09 19:45:06.397058716 +0000 UTC m=+991.929026205" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.420834 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.714292 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 19:45:06 crc kubenswrapper[4907]: I1009 19:45:06.905006 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc2mk"] Oct 09 19:45:06 crc kubenswrapper[4907]: W1009 19:45:06.906774 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23c05c5_d7fc_4792_be12_5cfabce11bf7.slice/crio-85089beccb6f892b52b44f1e7cc60ff231c911ae4c1a2c53d5ed095bd7cda3b5 WatchSource:0}: Error finding container 85089beccb6f892b52b44f1e7cc60ff231c911ae4c1a2c53d5ed095bd7cda3b5: Status 404 returned error can't find the container with id 85089beccb6f892b52b44f1e7cc60ff231c911ae4c1a2c53d5ed095bd7cda3b5 Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.162268 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013a6081-32a1-400a-8471-5d59718d543f" path="/var/lib/kubelet/pods/013a6081-32a1-400a-8471-5d59718d543f/volumes" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.240214 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.247726 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.250805 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.251026 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.251290 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gmwtl" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.251371 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.262683 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.336015 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd0a266f-be5f-4162-87fb-7389f11c37ab-lock\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.336063 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd0a266f-be5f-4162-87fb-7389f11c37ab-cache\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.336116 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: 
I1009 19:45:07.336136 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.336183 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjv8\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-kube-api-access-hzjv8\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.377289 4907 generic.go:334] "Generic (PLEG): container finished" podID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerID="13c3fa1be888b2237dfe74705489ea3d9316c6bad630fc0092bd2ee2bd4c6a82" exitCode=0 Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.377353 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc2mk" event={"ID":"c23c05c5-d7fc-4792-be12-5cfabce11bf7","Type":"ContainerDied","Data":"13c3fa1be888b2237dfe74705489ea3d9316c6bad630fc0092bd2ee2bd4c6a82"} Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.377379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc2mk" event={"ID":"c23c05c5-d7fc-4792-be12-5cfabce11bf7","Type":"ContainerStarted","Data":"85089beccb6f892b52b44f1e7cc60ff231c911ae4c1a2c53d5ed095bd7cda3b5"} Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.378676 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"97065ead-95b6-46d3-ab20-a073f6b5f243","Type":"ContainerStarted","Data":"b46df0745ec83355e2ac433e0716d793cb47bc97ee030dea2ed2a727dad1c0ce"} Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.378867 4907 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" podUID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerName="dnsmasq-dns" containerID="cri-o://fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52" gracePeriod=10 Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.437555 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjv8\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-kube-api-access-hzjv8\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.437686 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd0a266f-be5f-4162-87fb-7389f11c37ab-lock\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.437713 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd0a266f-be5f-4162-87fb-7389f11c37ab-cache\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.437800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.437816 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: 
\"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.444413 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fd0a266f-be5f-4162-87fb-7389f11c37ab-lock\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.446008 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.446805 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fd0a266f-be5f-4162-87fb-7389f11c37ab-cache\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: E1009 19:45:07.446993 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 19:45:07 crc kubenswrapper[4907]: E1009 19:45:07.447016 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 19:45:07 crc kubenswrapper[4907]: E1009 19:45:07.447066 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift podName:fd0a266f-be5f-4162-87fb-7389f11c37ab nodeName:}" failed. No retries permitted until 2025-10-09 19:45:07.947045994 +0000 UTC m=+993.479013583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift") pod "swift-storage-0" (UID: "fd0a266f-be5f-4162-87fb-7389f11c37ab") : configmap "swift-ring-files" not found Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.478967 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-b2l5z"] Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.480221 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.505408 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.505417 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjv8\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-kube-api-access-hzjv8\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.505545 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.505632 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.511733 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.528521 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-b2l5z"] Oct 09 19:45:07 crc kubenswrapper[4907]: 
I1009 19:45:07.539420 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-combined-ca-bundle\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.539489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn7gh\" (UniqueName: \"kubernetes.io/projected/bba198fe-5d7c-4f5c-a820-ddf9978aed83-kube-api-access-zn7gh\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.539511 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-dispersionconf\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.539547 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bba198fe-5d7c-4f5c-a820-ddf9978aed83-etc-swift\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.539805 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-scripts\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc 
kubenswrapper[4907]: I1009 19:45:07.539875 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-swiftconf\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.539907 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-ring-data-devices\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.641250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-swiftconf\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.641296 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-ring-data-devices\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.641347 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-combined-ca-bundle\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.641370 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn7gh\" (UniqueName: \"kubernetes.io/projected/bba198fe-5d7c-4f5c-a820-ddf9978aed83-kube-api-access-zn7gh\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.641388 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-dispersionconf\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.641422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bba198fe-5d7c-4f5c-a820-ddf9978aed83-etc-swift\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.641514 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-scripts\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.642139 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-scripts\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.642834 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-ring-data-devices\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.646019 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-swiftconf\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.654587 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bba198fe-5d7c-4f5c-a820-ddf9978aed83-etc-swift\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.655096 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-combined-ca-bundle\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.655441 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-dispersionconf\") pod \"swift-ring-rebalance-b2l5z\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.665009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn7gh\" (UniqueName: \"kubernetes.io/projected/bba198fe-5d7c-4f5c-a820-ddf9978aed83-kube-api-access-zn7gh\") pod \"swift-ring-rebalance-b2l5z\" 
(UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.827125 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.945437 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.947550 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-config\") pod \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.947586 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp4bg\" (UniqueName: \"kubernetes.io/projected/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-kube-api-access-qp4bg\") pod \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.947619 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-sb\") pod \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.947648 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-nb\") pod \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.947726 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-dns-svc\") pod \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\" (UID: \"0eb85480-a1d3-47cc-ba0a-d5bdc877342d\") " Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.947956 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:07 crc kubenswrapper[4907]: E1009 19:45:07.948149 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 19:45:07 crc kubenswrapper[4907]: E1009 19:45:07.948170 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 19:45:07 crc kubenswrapper[4907]: E1009 19:45:07.948221 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift podName:fd0a266f-be5f-4162-87fb-7389f11c37ab nodeName:}" failed. No retries permitted until 2025-10-09 19:45:08.948203207 +0000 UTC m=+994.480170696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift") pod "swift-storage-0" (UID: "fd0a266f-be5f-4162-87fb-7389f11c37ab") : configmap "swift-ring-files" not found Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.953305 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-kube-api-access-qp4bg" (OuterVolumeSpecName: "kube-api-access-qp4bg") pod "0eb85480-a1d3-47cc-ba0a-d5bdc877342d" (UID: "0eb85480-a1d3-47cc-ba0a-d5bdc877342d"). 
InnerVolumeSpecName "kube-api-access-qp4bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:07 crc kubenswrapper[4907]: I1009 19:45:07.993430 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0eb85480-a1d3-47cc-ba0a-d5bdc877342d" (UID: "0eb85480-a1d3-47cc-ba0a-d5bdc877342d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.002091 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0eb85480-a1d3-47cc-ba0a-d5bdc877342d" (UID: "0eb85480-a1d3-47cc-ba0a-d5bdc877342d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.002111 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-config" (OuterVolumeSpecName: "config") pod "0eb85480-a1d3-47cc-ba0a-d5bdc877342d" (UID: "0eb85480-a1d3-47cc-ba0a-d5bdc877342d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.006004 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0eb85480-a1d3-47cc-ba0a-d5bdc877342d" (UID: "0eb85480-a1d3-47cc-ba0a-d5bdc877342d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.049034 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.049066 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.049076 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp4bg\" (UniqueName: \"kubernetes.io/projected/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-kube-api-access-qp4bg\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.049085 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.049095 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eb85480-a1d3-47cc-ba0a-d5bdc877342d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:08 crc kubenswrapper[4907]: E1009 19:45:08.338665 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.104:47294->38.102.83.104:46823: write tcp 38.102.83.104:47294->38.102.83.104:46823: write: broken pipe Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.405131 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"97065ead-95b6-46d3-ab20-a073f6b5f243","Type":"ContainerStarted","Data":"430ba0c79eb0e98775ecdd9f0a76b5da86735159de9242da748f38da99c23b83"} Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 
19:45:08.416617 4907 generic.go:334] "Generic (PLEG): container finished" podID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerID="fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52" exitCode=0 Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.416832 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" event={"ID":"0eb85480-a1d3-47cc-ba0a-d5bdc877342d","Type":"ContainerDied","Data":"fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52"} Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.417002 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" event={"ID":"0eb85480-a1d3-47cc-ba0a-d5bdc877342d","Type":"ContainerDied","Data":"0879cc71f941e5ab387ea619ace304de50be42612121ebb943da0469c4356bce"} Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.417087 4907 scope.go:117] "RemoveContainer" containerID="fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.416933 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gnmbc" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.422170 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc2mk" event={"ID":"c23c05c5-d7fc-4792-be12-5cfabce11bf7","Type":"ContainerStarted","Data":"4ee8eb790bbfb82cf9e7658d447906e04ac246d21dc646a627c9f8f1afb7807b"} Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.422350 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.445107 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gc2mk" podStartSLOduration=2.445088002 podStartE2EDuration="2.445088002s" podCreationTimestamp="2025-10-09 19:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:45:08.442216385 +0000 UTC m=+993.974183894" watchObservedRunningTime="2025-10-09 19:45:08.445088002 +0000 UTC m=+993.977055501" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.520626 4907 scope.go:117] "RemoveContainer" containerID="45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.559023 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gnmbc"] Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.564483 4907 scope.go:117] "RemoveContainer" containerID="fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52" Oct 09 19:45:08 crc kubenswrapper[4907]: E1009 19:45:08.565620 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52\": container with ID starting with 
fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52 not found: ID does not exist" containerID="fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.565734 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52"} err="failed to get container status \"fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52\": rpc error: code = NotFound desc = could not find container \"fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52\": container with ID starting with fdebab8fd35b90742d13673a0945b3f0a9ba48f7ab46accad501b01b45a4da52 not found: ID does not exist" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.565836 4907 scope.go:117] "RemoveContainer" containerID="45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.565965 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gnmbc"] Oct 09 19:45:08 crc kubenswrapper[4907]: E1009 19:45:08.566350 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5\": container with ID starting with 45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5 not found: ID does not exist" containerID="45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5" Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.566543 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5"} err="failed to get container status \"45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5\": rpc error: code = NotFound desc = could not find container 
\"45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5\": container with ID starting with 45cddfa000cc295d125c55a253932319c4f13c9d01d252b79cbed8dfa9aeefa5 not found: ID does not exist" Oct 09 19:45:08 crc kubenswrapper[4907]: W1009 19:45:08.566519 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbba198fe_5d7c_4f5c_a820_ddf9978aed83.slice/crio-2c60171dd69ac4747baf7f32c00b8f6b1b1ca17a419c579b3e8d6dff55f34cb4 WatchSource:0}: Error finding container 2c60171dd69ac4747baf7f32c00b8f6b1b1ca17a419c579b3e8d6dff55f34cb4: Status 404 returned error can't find the container with id 2c60171dd69ac4747baf7f32c00b8f6b1b1ca17a419c579b3e8d6dff55f34cb4 Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.581805 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-b2l5z"] Oct 09 19:45:08 crc kubenswrapper[4907]: I1009 19:45:08.967172 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:08 crc kubenswrapper[4907]: E1009 19:45:08.967604 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 19:45:08 crc kubenswrapper[4907]: E1009 19:45:08.967701 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 19:45:08 crc kubenswrapper[4907]: E1009 19:45:08.967818 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift podName:fd0a266f-be5f-4162-87fb-7389f11c37ab nodeName:}" failed. 
No retries permitted until 2025-10-09 19:45:10.967791483 +0000 UTC m=+996.499758972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift") pod "swift-storage-0" (UID: "fd0a266f-be5f-4162-87fb-7389f11c37ab") : configmap "swift-ring-files" not found Oct 09 19:45:09 crc kubenswrapper[4907]: I1009 19:45:09.161684 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" path="/var/lib/kubelet/pods/0eb85480-a1d3-47cc-ba0a-d5bdc877342d/volumes" Oct 09 19:45:09 crc kubenswrapper[4907]: I1009 19:45:09.430845 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"97065ead-95b6-46d3-ab20-a073f6b5f243","Type":"ContainerStarted","Data":"24830fe01d780e377d7b8739600c882e649aea93421eb8771ad5116996e5f532"} Oct 09 19:45:09 crc kubenswrapper[4907]: I1009 19:45:09.430923 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 09 19:45:09 crc kubenswrapper[4907]: I1009 19:45:09.432913 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b2l5z" event={"ID":"bba198fe-5d7c-4f5c-a820-ddf9978aed83","Type":"ContainerStarted","Data":"2c60171dd69ac4747baf7f32c00b8f6b1b1ca17a419c579b3e8d6dff55f34cb4"} Oct 09 19:45:09 crc kubenswrapper[4907]: I1009 19:45:09.450323 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.044897382 podStartE2EDuration="4.450305905s" podCreationTimestamp="2025-10-09 19:45:05 +0000 UTC" firstStartedPulling="2025-10-09 19:45:06.73255088 +0000 UTC m=+992.264518369" lastFinishedPulling="2025-10-09 19:45:08.137959403 +0000 UTC m=+993.669926892" observedRunningTime="2025-10-09 19:45:09.447514231 +0000 UTC m=+994.979481720" watchObservedRunningTime="2025-10-09 19:45:09.450305905 +0000 UTC m=+994.982273394" Oct 09 
19:45:09 crc kubenswrapper[4907]: I1009 19:45:09.891034 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 09 19:45:09 crc kubenswrapper[4907]: I1009 19:45:09.938070 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 09 19:45:11 crc kubenswrapper[4907]: I1009 19:45:11.015030 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:11 crc kubenswrapper[4907]: E1009 19:45:11.015221 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 19:45:11 crc kubenswrapper[4907]: E1009 19:45:11.015238 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 19:45:11 crc kubenswrapper[4907]: E1009 19:45:11.015280 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift podName:fd0a266f-be5f-4162-87fb-7389f11c37ab nodeName:}" failed. No retries permitted until 2025-10-09 19:45:15.015265815 +0000 UTC m=+1000.547233304 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift") pod "swift-storage-0" (UID: "fd0a266f-be5f-4162-87fb-7389f11c37ab") : configmap "swift-ring-files" not found Oct 09 19:45:12 crc kubenswrapper[4907]: I1009 19:45:12.404910 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 09 19:45:12 crc kubenswrapper[4907]: I1009 19:45:12.452860 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 09 19:45:13 crc kubenswrapper[4907]: I1009 19:45:13.463411 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b2l5z" event={"ID":"bba198fe-5d7c-4f5c-a820-ddf9978aed83","Type":"ContainerStarted","Data":"5836dbd7c5351e454b1f9e04fd22b70805996eebc2fe30f19e53b55fdbdaa56a"} Oct 09 19:45:13 crc kubenswrapper[4907]: I1009 19:45:13.490078 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-b2l5z" podStartSLOduration=2.7157562090000003 podStartE2EDuration="6.490047633s" podCreationTimestamp="2025-10-09 19:45:07 +0000 UTC" firstStartedPulling="2025-10-09 19:45:08.569582399 +0000 UTC m=+994.101549888" lastFinishedPulling="2025-10-09 19:45:12.343873823 +0000 UTC m=+997.875841312" observedRunningTime="2025-10-09 19:45:13.48084384 +0000 UTC m=+999.012811379" watchObservedRunningTime="2025-10-09 19:45:13.490047633 +0000 UTC m=+999.022015142" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.151738 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-49246"] Oct 09 19:45:14 crc kubenswrapper[4907]: E1009 19:45:14.152354 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerName="init" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.152373 4907 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerName="init" Oct 09 19:45:14 crc kubenswrapper[4907]: E1009 19:45:14.152392 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerName="dnsmasq-dns" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.152400 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerName="dnsmasq-dns" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.152973 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb85480-a1d3-47cc-ba0a-d5bdc877342d" containerName="dnsmasq-dns" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.153700 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-49246" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.164638 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-49246"] Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.276417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jddj\" (UniqueName: \"kubernetes.io/projected/f011c439-2d43-4c4e-b1e2-b91c4f2d77e9-kube-api-access-2jddj\") pod \"keystone-db-create-49246\" (UID: \"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9\") " pod="openstack/keystone-db-create-49246" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.372717 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-txm7b"] Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.373994 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-txm7b" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.377808 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jddj\" (UniqueName: \"kubernetes.io/projected/f011c439-2d43-4c4e-b1e2-b91c4f2d77e9-kube-api-access-2jddj\") pod \"keystone-db-create-49246\" (UID: \"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9\") " pod="openstack/keystone-db-create-49246" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.387648 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-txm7b"] Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.412615 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jddj\" (UniqueName: \"kubernetes.io/projected/f011c439-2d43-4c4e-b1e2-b91c4f2d77e9-kube-api-access-2jddj\") pod \"keystone-db-create-49246\" (UID: \"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9\") " pod="openstack/keystone-db-create-49246" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.472388 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-49246" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.478756 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7lf\" (UniqueName: \"kubernetes.io/projected/3cab588e-6160-4912-8fde-e12b7f005984-kube-api-access-2j7lf\") pod \"placement-db-create-txm7b\" (UID: \"3cab588e-6160-4912-8fde-e12b7f005984\") " pod="openstack/placement-db-create-txm7b" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.580537 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7lf\" (UniqueName: \"kubernetes.io/projected/3cab588e-6160-4912-8fde-e12b7f005984-kube-api-access-2j7lf\") pod \"placement-db-create-txm7b\" (UID: \"3cab588e-6160-4912-8fde-e12b7f005984\") " pod="openstack/placement-db-create-txm7b" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.604334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7lf\" (UniqueName: \"kubernetes.io/projected/3cab588e-6160-4912-8fde-e12b7f005984-kube-api-access-2j7lf\") pod \"placement-db-create-txm7b\" (UID: \"3cab588e-6160-4912-8fde-e12b7f005984\") " pod="openstack/placement-db-create-txm7b" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.706732 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-txm7b" Oct 09 19:45:14 crc kubenswrapper[4907]: I1009 19:45:14.976783 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-49246"] Oct 09 19:45:14 crc kubenswrapper[4907]: W1009 19:45:14.989730 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf011c439_2d43_4c4e_b1e2_b91c4f2d77e9.slice/crio-d2b11826215fa56e6b651f3625d11578c10caf91028dc833079f07e414c30d9d WatchSource:0}: Error finding container d2b11826215fa56e6b651f3625d11578c10caf91028dc833079f07e414c30d9d: Status 404 returned error can't find the container with id d2b11826215fa56e6b651f3625d11578c10caf91028dc833079f07e414c30d9d Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.090674 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:15 crc kubenswrapper[4907]: E1009 19:45:15.090916 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 19:45:15 crc kubenswrapper[4907]: E1009 19:45:15.091099 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 19:45:15 crc kubenswrapper[4907]: E1009 19:45:15.091181 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift podName:fd0a266f-be5f-4162-87fb-7389f11c37ab nodeName:}" failed. No retries permitted until 2025-10-09 19:45:23.091138739 +0000 UTC m=+1008.623106228 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift") pod "swift-storage-0" (UID: "fd0a266f-be5f-4162-87fb-7389f11c37ab") : configmap "swift-ring-files" not found Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.272021 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-txm7b"] Oct 09 19:45:15 crc kubenswrapper[4907]: W1009 19:45:15.277133 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cab588e_6160_4912_8fde_e12b7f005984.slice/crio-0c8376bb0d28c7c48a87098c4fd5b21867497144931daaa25ddf50f8f0744ebe WatchSource:0}: Error finding container 0c8376bb0d28c7c48a87098c4fd5b21867497144931daaa25ddf50f8f0744ebe: Status 404 returned error can't find the container with id 0c8376bb0d28c7c48a87098c4fd5b21867497144931daaa25ddf50f8f0744ebe Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.487132 4907 generic.go:334] "Generic (PLEG): container finished" podID="f011c439-2d43-4c4e-b1e2-b91c4f2d77e9" containerID="8610e92d2aa814db2029fb98bc4b4850506b32cf32dd841895a03390c370ce9f" exitCode=0 Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.487482 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-49246" event={"ID":"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9","Type":"ContainerDied","Data":"8610e92d2aa814db2029fb98bc4b4850506b32cf32dd841895a03390c370ce9f"} Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.487520 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-49246" event={"ID":"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9","Type":"ContainerStarted","Data":"d2b11826215fa56e6b651f3625d11578c10caf91028dc833079f07e414c30d9d"} Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.489873 4907 generic.go:334] "Generic (PLEG): container finished" podID="3cab588e-6160-4912-8fde-e12b7f005984" 
containerID="1194eb317fc569e3644462f89cfc41d57e487cc5be070d80d3503ef05de18479" exitCode=0 Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.489907 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txm7b" event={"ID":"3cab588e-6160-4912-8fde-e12b7f005984","Type":"ContainerDied","Data":"1194eb317fc569e3644462f89cfc41d57e487cc5be070d80d3503ef05de18479"} Oct 09 19:45:15 crc kubenswrapper[4907]: I1009 19:45:15.489928 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txm7b" event={"ID":"3cab588e-6160-4912-8fde-e12b7f005984","Type":"ContainerStarted","Data":"0c8376bb0d28c7c48a87098c4fd5b21867497144931daaa25ddf50f8f0744ebe"} Oct 09 19:45:16 crc kubenswrapper[4907]: I1009 19:45:16.422734 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:16 crc kubenswrapper[4907]: I1009 19:45:16.478557 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j7zr6"] Oct 09 19:45:16 crc kubenswrapper[4907]: I1009 19:45:16.478840 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" podUID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerName="dnsmasq-dns" containerID="cri-o://2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff" gracePeriod=10 Oct 09 19:45:16 crc kubenswrapper[4907]: I1009 19:45:16.867238 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-txm7b" Oct 09 19:45:16 crc kubenswrapper[4907]: I1009 19:45:16.995559 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-49246" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.002313 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.033076 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7lf\" (UniqueName: \"kubernetes.io/projected/3cab588e-6160-4912-8fde-e12b7f005984-kube-api-access-2j7lf\") pod \"3cab588e-6160-4912-8fde-e12b7f005984\" (UID: \"3cab588e-6160-4912-8fde-e12b7f005984\") " Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.039393 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cab588e-6160-4912-8fde-e12b7f005984-kube-api-access-2j7lf" (OuterVolumeSpecName: "kube-api-access-2j7lf") pod "3cab588e-6160-4912-8fde-e12b7f005984" (UID: "3cab588e-6160-4912-8fde-e12b7f005984"). InnerVolumeSpecName "kube-api-access-2j7lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.135019 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jddj\" (UniqueName: \"kubernetes.io/projected/f011c439-2d43-4c4e-b1e2-b91c4f2d77e9-kube-api-access-2jddj\") pod \"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9\" (UID: \"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9\") " Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.135060 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-dns-svc\") pod \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.135135 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-config\") pod \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 
19:45:17.135162 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qssz\" (UniqueName: \"kubernetes.io/projected/eb8b4400-e16b-45de-b19a-0d312e4f1e51-kube-api-access-9qssz\") pod \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\" (UID: \"eb8b4400-e16b-45de-b19a-0d312e4f1e51\") " Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.135546 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7lf\" (UniqueName: \"kubernetes.io/projected/3cab588e-6160-4912-8fde-e12b7f005984-kube-api-access-2j7lf\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.138041 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8b4400-e16b-45de-b19a-0d312e4f1e51-kube-api-access-9qssz" (OuterVolumeSpecName: "kube-api-access-9qssz") pod "eb8b4400-e16b-45de-b19a-0d312e4f1e51" (UID: "eb8b4400-e16b-45de-b19a-0d312e4f1e51"). InnerVolumeSpecName "kube-api-access-9qssz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.140881 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f011c439-2d43-4c4e-b1e2-b91c4f2d77e9-kube-api-access-2jddj" (OuterVolumeSpecName: "kube-api-access-2jddj") pod "f011c439-2d43-4c4e-b1e2-b91c4f2d77e9" (UID: "f011c439-2d43-4c4e-b1e2-b91c4f2d77e9"). InnerVolumeSpecName "kube-api-access-2jddj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.177181 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-config" (OuterVolumeSpecName: "config") pod "eb8b4400-e16b-45de-b19a-0d312e4f1e51" (UID: "eb8b4400-e16b-45de-b19a-0d312e4f1e51"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.192544 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb8b4400-e16b-45de-b19a-0d312e4f1e51" (UID: "eb8b4400-e16b-45de-b19a-0d312e4f1e51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.236719 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jddj\" (UniqueName: \"kubernetes.io/projected/f011c439-2d43-4c4e-b1e2-b91c4f2d77e9-kube-api-access-2jddj\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.236765 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.236776 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8b4400-e16b-45de-b19a-0d312e4f1e51-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.236787 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qssz\" (UniqueName: \"kubernetes.io/projected/eb8b4400-e16b-45de-b19a-0d312e4f1e51-kube-api-access-9qssz\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.507933 4907 generic.go:334] "Generic (PLEG): container finished" podID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerID="2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff" exitCode=0 Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.507971 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.508007 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" event={"ID":"eb8b4400-e16b-45de-b19a-0d312e4f1e51","Type":"ContainerDied","Data":"2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff"} Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.508044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-j7zr6" event={"ID":"eb8b4400-e16b-45de-b19a-0d312e4f1e51","Type":"ContainerDied","Data":"c6f5d8934e48af883764ed2268d945208e2234964672f2008ddbe0503eaf14cb"} Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.508075 4907 scope.go:117] "RemoveContainer" containerID="2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.509409 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-49246" event={"ID":"f011c439-2d43-4c4e-b1e2-b91c4f2d77e9","Type":"ContainerDied","Data":"d2b11826215fa56e6b651f3625d11578c10caf91028dc833079f07e414c30d9d"} Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.509438 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2b11826215fa56e6b651f3625d11578c10caf91028dc833079f07e414c30d9d" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.509496 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-49246" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.511724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txm7b" event={"ID":"3cab588e-6160-4912-8fde-e12b7f005984","Type":"ContainerDied","Data":"0c8376bb0d28c7c48a87098c4fd5b21867497144931daaa25ddf50f8f0744ebe"} Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.512008 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c8376bb0d28c7c48a87098c4fd5b21867497144931daaa25ddf50f8f0744ebe" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.511799 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-txm7b" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.534025 4907 scope.go:117] "RemoveContainer" containerID="46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.546606 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j7zr6"] Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.553985 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-j7zr6"] Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.560270 4907 scope.go:117] "RemoveContainer" containerID="2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff" Oct 09 19:45:17 crc kubenswrapper[4907]: E1009 19:45:17.560610 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff\": container with ID starting with 2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff not found: ID does not exist" containerID="2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.560646 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff"} err="failed to get container status \"2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff\": rpc error: code = NotFound desc = could not find container \"2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff\": container with ID starting with 2cb4309cc469cbfacd58ac48a7f216cab22e0fbe9e411f29f3c2d7fb94a102ff not found: ID does not exist" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.560668 4907 scope.go:117] "RemoveContainer" containerID="46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604" Oct 09 19:45:17 crc kubenswrapper[4907]: E1009 19:45:17.560891 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604\": container with ID starting with 46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604 not found: ID does not exist" containerID="46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604" Oct 09 19:45:17 crc kubenswrapper[4907]: I1009 19:45:17.560924 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604"} err="failed to get container status \"46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604\": rpc error: code = NotFound desc = could not find container \"46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604\": container with ID starting with 46bf38bb42fea9dad55fb1332b4413a63b547e4127deb7d2715fa78183b2d604 not found: ID does not exist" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.179570 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" 
path="/var/lib/kubelet/pods/eb8b4400-e16b-45de-b19a-0d312e4f1e51/volumes" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.799302 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-58csx"] Oct 09 19:45:19 crc kubenswrapper[4907]: E1009 19:45:19.800211 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerName="init" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.800311 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerName="init" Oct 09 19:45:19 crc kubenswrapper[4907]: E1009 19:45:19.800421 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f011c439-2d43-4c4e-b1e2-b91c4f2d77e9" containerName="mariadb-database-create" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.800518 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f011c439-2d43-4c4e-b1e2-b91c4f2d77e9" containerName="mariadb-database-create" Oct 09 19:45:19 crc kubenswrapper[4907]: E1009 19:45:19.800611 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cab588e-6160-4912-8fde-e12b7f005984" containerName="mariadb-database-create" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.800692 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cab588e-6160-4912-8fde-e12b7f005984" containerName="mariadb-database-create" Oct 09 19:45:19 crc kubenswrapper[4907]: E1009 19:45:19.800779 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerName="dnsmasq-dns" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.800849 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerName="dnsmasq-dns" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.801142 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f011c439-2d43-4c4e-b1e2-b91c4f2d77e9" 
containerName="mariadb-database-create" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.801254 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cab588e-6160-4912-8fde-e12b7f005984" containerName="mariadb-database-create" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.801379 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8b4400-e16b-45de-b19a-0d312e4f1e51" containerName="dnsmasq-dns" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.802693 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-58csx" Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.815518 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-58csx"] Oct 09 19:45:19 crc kubenswrapper[4907]: I1009 19:45:19.991109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vssc\" (UniqueName: \"kubernetes.io/projected/4ec9187c-1e24-4ef9-aff7-3b2391d23822-kube-api-access-5vssc\") pod \"glance-db-create-58csx\" (UID: \"4ec9187c-1e24-4ef9-aff7-3b2391d23822\") " pod="openstack/glance-db-create-58csx" Oct 09 19:45:20 crc kubenswrapper[4907]: I1009 19:45:20.092753 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vssc\" (UniqueName: \"kubernetes.io/projected/4ec9187c-1e24-4ef9-aff7-3b2391d23822-kube-api-access-5vssc\") pod \"glance-db-create-58csx\" (UID: \"4ec9187c-1e24-4ef9-aff7-3b2391d23822\") " pod="openstack/glance-db-create-58csx" Oct 09 19:45:20 crc kubenswrapper[4907]: I1009 19:45:20.125134 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vssc\" (UniqueName: \"kubernetes.io/projected/4ec9187c-1e24-4ef9-aff7-3b2391d23822-kube-api-access-5vssc\") pod \"glance-db-create-58csx\" (UID: \"4ec9187c-1e24-4ef9-aff7-3b2391d23822\") " pod="openstack/glance-db-create-58csx" Oct 09 19:45:20 crc 
kubenswrapper[4907]: I1009 19:45:20.423409 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-58csx" Oct 09 19:45:20 crc kubenswrapper[4907]: I1009 19:45:20.559621 4907 generic.go:334] "Generic (PLEG): container finished" podID="bba198fe-5d7c-4f5c-a820-ddf9978aed83" containerID="5836dbd7c5351e454b1f9e04fd22b70805996eebc2fe30f19e53b55fdbdaa56a" exitCode=0 Oct 09 19:45:20 crc kubenswrapper[4907]: I1009 19:45:20.559737 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b2l5z" event={"ID":"bba198fe-5d7c-4f5c-a820-ddf9978aed83","Type":"ContainerDied","Data":"5836dbd7c5351e454b1f9e04fd22b70805996eebc2fe30f19e53b55fdbdaa56a"} Oct 09 19:45:20 crc kubenswrapper[4907]: I1009 19:45:20.936771 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-58csx"] Oct 09 19:45:20 crc kubenswrapper[4907]: W1009 19:45:20.941139 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ec9187c_1e24_4ef9_aff7_3b2391d23822.slice/crio-b9e4510d00b58f706f071dd0dfaab21fec01dcbdc4dd6563e66a22e5c7ee9268 WatchSource:0}: Error finding container b9e4510d00b58f706f071dd0dfaab21fec01dcbdc4dd6563e66a22e5c7ee9268: Status 404 returned error can't find the container with id b9e4510d00b58f706f071dd0dfaab21fec01dcbdc4dd6563e66a22e5c7ee9268 Oct 09 19:45:21 crc kubenswrapper[4907]: I1009 19:45:21.094726 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 09 19:45:21 crc kubenswrapper[4907]: I1009 19:45:21.568524 4907 generic.go:334] "Generic (PLEG): container finished" podID="4ec9187c-1e24-4ef9-aff7-3b2391d23822" containerID="9d4337ef10824f1f867f477fb038d7bfb4c7be79845b4c70a00b3a7fe1f7a460" exitCode=0 Oct 09 19:45:21 crc kubenswrapper[4907]: I1009 19:45:21.568580 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-58csx" event={"ID":"4ec9187c-1e24-4ef9-aff7-3b2391d23822","Type":"ContainerDied","Data":"9d4337ef10824f1f867f477fb038d7bfb4c7be79845b4c70a00b3a7fe1f7a460"} Oct 09 19:45:21 crc kubenswrapper[4907]: I1009 19:45:21.568632 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-58csx" event={"ID":"4ec9187c-1e24-4ef9-aff7-3b2391d23822","Type":"ContainerStarted","Data":"b9e4510d00b58f706f071dd0dfaab21fec01dcbdc4dd6563e66a22e5c7ee9268"} Oct 09 19:45:21 crc kubenswrapper[4907]: I1009 19:45:21.899199 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.040781 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-scripts\") pod \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.040850 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-swiftconf\") pod \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.040910 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn7gh\" (UniqueName: \"kubernetes.io/projected/bba198fe-5d7c-4f5c-a820-ddf9978aed83-kube-api-access-zn7gh\") pod \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.040953 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-ring-data-devices\") 
pod \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.041021 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-dispersionconf\") pod \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.041111 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bba198fe-5d7c-4f5c-a820-ddf9978aed83-etc-swift\") pod \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.041156 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-combined-ca-bundle\") pod \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\" (UID: \"bba198fe-5d7c-4f5c-a820-ddf9978aed83\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.042381 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bba198fe-5d7c-4f5c-a820-ddf9978aed83" (UID: "bba198fe-5d7c-4f5c-a820-ddf9978aed83"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.042494 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bba198fe-5d7c-4f5c-a820-ddf9978aed83-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bba198fe-5d7c-4f5c-a820-ddf9978aed83" (UID: "bba198fe-5d7c-4f5c-a820-ddf9978aed83"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.046568 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba198fe-5d7c-4f5c-a820-ddf9978aed83-kube-api-access-zn7gh" (OuterVolumeSpecName: "kube-api-access-zn7gh") pod "bba198fe-5d7c-4f5c-a820-ddf9978aed83" (UID: "bba198fe-5d7c-4f5c-a820-ddf9978aed83"). InnerVolumeSpecName "kube-api-access-zn7gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.052594 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bba198fe-5d7c-4f5c-a820-ddf9978aed83" (UID: "bba198fe-5d7c-4f5c-a820-ddf9978aed83"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.071076 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-scripts" (OuterVolumeSpecName: "scripts") pod "bba198fe-5d7c-4f5c-a820-ddf9978aed83" (UID: "bba198fe-5d7c-4f5c-a820-ddf9978aed83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.073999 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bba198fe-5d7c-4f5c-a820-ddf9978aed83" (UID: "bba198fe-5d7c-4f5c-a820-ddf9978aed83"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.076128 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bba198fe-5d7c-4f5c-a820-ddf9978aed83" (UID: "bba198fe-5d7c-4f5c-a820-ddf9978aed83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.155354 4907 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.155379 4907 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bba198fe-5d7c-4f5c-a820-ddf9978aed83-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.155388 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.155400 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.155408 4907 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bba198fe-5d7c-4f5c-a820-ddf9978aed83-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.155416 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn7gh\" (UniqueName: 
\"kubernetes.io/projected/bba198fe-5d7c-4f5c-a820-ddf9978aed83-kube-api-access-zn7gh\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.155450 4907 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bba198fe-5d7c-4f5c-a820-ddf9978aed83-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.578725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-b2l5z" event={"ID":"bba198fe-5d7c-4f5c-a820-ddf9978aed83","Type":"ContainerDied","Data":"2c60171dd69ac4747baf7f32c00b8f6b1b1ca17a419c579b3e8d6dff55f34cb4"} Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.578770 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c60171dd69ac4747baf7f32c00b8f6b1b1ca17a419c579b3e8d6dff55f34cb4" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.578780 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-b2l5z" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.879936 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-58csx" Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.978978 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vssc\" (UniqueName: \"kubernetes.io/projected/4ec9187c-1e24-4ef9-aff7-3b2391d23822-kube-api-access-5vssc\") pod \"4ec9187c-1e24-4ef9-aff7-3b2391d23822\" (UID: \"4ec9187c-1e24-4ef9-aff7-3b2391d23822\") " Oct 09 19:45:22 crc kubenswrapper[4907]: I1009 19:45:22.984725 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec9187c-1e24-4ef9-aff7-3b2391d23822-kube-api-access-5vssc" (OuterVolumeSpecName: "kube-api-access-5vssc") pod "4ec9187c-1e24-4ef9-aff7-3b2391d23822" (UID: "4ec9187c-1e24-4ef9-aff7-3b2391d23822"). InnerVolumeSpecName "kube-api-access-5vssc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.081074 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vssc\" (UniqueName: \"kubernetes.io/projected/4ec9187c-1e24-4ef9-aff7-3b2391d23822-kube-api-access-5vssc\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.182615 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.186872 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fd0a266f-be5f-4162-87fb-7389f11c37ab-etc-swift\") pod \"swift-storage-0\" (UID: \"fd0a266f-be5f-4162-87fb-7389f11c37ab\") " pod="openstack/swift-storage-0" Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.231959 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.588580 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-58csx" Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.588579 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-58csx" event={"ID":"4ec9187c-1e24-4ef9-aff7-3b2391d23822","Type":"ContainerDied","Data":"b9e4510d00b58f706f071dd0dfaab21fec01dcbdc4dd6563e66a22e5c7ee9268"} Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.588968 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9e4510d00b58f706f071dd0dfaab21fec01dcbdc4dd6563e66a22e5c7ee9268" Oct 09 19:45:23 crc kubenswrapper[4907]: I1009 19:45:23.790267 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 19:45:23 crc kubenswrapper[4907]: W1009 19:45:23.790892 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd0a266f_be5f_4162_87fb_7389f11c37ab.slice/crio-f5682dadd219886e34ececcd7402d240c3939cf6c8f4524d57a987487dc7872c WatchSource:0}: Error finding container f5682dadd219886e34ececcd7402d240c3939cf6c8f4524d57a987487dc7872c: Status 404 returned error can't find the container with id f5682dadd219886e34ececcd7402d240c3939cf6c8f4524d57a987487dc7872c Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.191744 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b919-account-create-znlsd"] Oct 09 19:45:24 crc kubenswrapper[4907]: E1009 19:45:24.192771 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec9187c-1e24-4ef9-aff7-3b2391d23822" containerName="mariadb-database-create" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.192822 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec9187c-1e24-4ef9-aff7-3b2391d23822" 
containerName="mariadb-database-create" Oct 09 19:45:24 crc kubenswrapper[4907]: E1009 19:45:24.192851 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba198fe-5d7c-4f5c-a820-ddf9978aed83" containerName="swift-ring-rebalance" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.192868 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba198fe-5d7c-4f5c-a820-ddf9978aed83" containerName="swift-ring-rebalance" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.193273 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba198fe-5d7c-4f5c-a820-ddf9978aed83" containerName="swift-ring-rebalance" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.193338 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec9187c-1e24-4ef9-aff7-3b2391d23822" containerName="mariadb-database-create" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.194551 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b919-account-create-znlsd" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.198027 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.209599 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b919-account-create-znlsd"] Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.299149 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq9d4\" (UniqueName: \"kubernetes.io/projected/371529d4-9ff3-4104-badb-d3777b626f91-kube-api-access-zq9d4\") pod \"keystone-b919-account-create-znlsd\" (UID: \"371529d4-9ff3-4104-badb-d3777b626f91\") " pod="openstack/keystone-b919-account-create-znlsd" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.401230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq9d4\" 
(UniqueName: \"kubernetes.io/projected/371529d4-9ff3-4104-badb-d3777b626f91-kube-api-access-zq9d4\") pod \"keystone-b919-account-create-znlsd\" (UID: \"371529d4-9ff3-4104-badb-d3777b626f91\") " pod="openstack/keystone-b919-account-create-znlsd" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.424319 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq9d4\" (UniqueName: \"kubernetes.io/projected/371529d4-9ff3-4104-badb-d3777b626f91-kube-api-access-zq9d4\") pod \"keystone-b919-account-create-znlsd\" (UID: \"371529d4-9ff3-4104-badb-d3777b626f91\") " pod="openstack/keystone-b919-account-create-znlsd" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.494817 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a117-account-create-g9ntg"] Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.496433 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a117-account-create-g9ntg" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.498650 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.506812 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a117-account-create-g9ntg"] Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.519243 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b919-account-create-znlsd" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.597586 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"f5682dadd219886e34ececcd7402d240c3939cf6c8f4524d57a987487dc7872c"} Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.603859 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlnqk\" (UniqueName: \"kubernetes.io/projected/0ec0c091-4cc8-48a7-a072-fdea2cdc1125-kube-api-access-mlnqk\") pod \"placement-a117-account-create-g9ntg\" (UID: \"0ec0c091-4cc8-48a7-a072-fdea2cdc1125\") " pod="openstack/placement-a117-account-create-g9ntg" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.705428 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlnqk\" (UniqueName: \"kubernetes.io/projected/0ec0c091-4cc8-48a7-a072-fdea2cdc1125-kube-api-access-mlnqk\") pod \"placement-a117-account-create-g9ntg\" (UID: \"0ec0c091-4cc8-48a7-a072-fdea2cdc1125\") " pod="openstack/placement-a117-account-create-g9ntg" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.750873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlnqk\" (UniqueName: \"kubernetes.io/projected/0ec0c091-4cc8-48a7-a072-fdea2cdc1125-kube-api-access-mlnqk\") pod \"placement-a117-account-create-g9ntg\" (UID: \"0ec0c091-4cc8-48a7-a072-fdea2cdc1125\") " pod="openstack/placement-a117-account-create-g9ntg" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.822664 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a117-account-create-g9ntg" Oct 09 19:45:24 crc kubenswrapper[4907]: I1009 19:45:24.999447 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b919-account-create-znlsd"] Oct 09 19:45:25 crc kubenswrapper[4907]: W1009 19:45:25.004756 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod371529d4_9ff3_4104_badb_d3777b626f91.slice/crio-df61175b07c0b900fdb4195d41853f1df30b5bc55033b0f64226d4d914d5fe31 WatchSource:0}: Error finding container df61175b07c0b900fdb4195d41853f1df30b5bc55033b0f64226d4d914d5fe31: Status 404 returned error can't find the container with id df61175b07c0b900fdb4195d41853f1df30b5bc55033b0f64226d4d914d5fe31 Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.191944 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dz7f2" podUID="c9bf7943-cd49-4a26-83e2-9efc4c9dcc02" containerName="ovn-controller" probeResult="failure" output=< Oct 09 19:45:25 crc kubenswrapper[4907]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 19:45:25 crc kubenswrapper[4907]: > Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.252374 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.253835 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9z259" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.283338 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a117-account-create-g9ntg"] Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.533250 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dz7f2-config-gm8hf"] Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.535380 4907 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.537496 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.550834 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2-config-gm8hf"] Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.607844 4907 generic.go:334] "Generic (PLEG): container finished" podID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerID="6e458b11269c1cb13b676c476b97734eb508c7576d9ba6e4b28cb7cd48f34f70" exitCode=0 Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.607885 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c","Type":"ContainerDied","Data":"6e458b11269c1cb13b676c476b97734eb508c7576d9ba6e4b28cb7cd48f34f70"} Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.614205 4907 generic.go:334] "Generic (PLEG): container finished" podID="371529d4-9ff3-4104-badb-d3777b626f91" containerID="079c9991cda657e601f75ab8f5bd4c8f61086072d411cea7a3eb090d2b7083c8" exitCode=0 Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.614387 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b919-account-create-znlsd" event={"ID":"371529d4-9ff3-4104-badb-d3777b626f91","Type":"ContainerDied","Data":"079c9991cda657e601f75ab8f5bd4c8f61086072d411cea7a3eb090d2b7083c8"} Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.614490 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b919-account-create-znlsd" event={"ID":"371529d4-9ff3-4104-badb-d3777b626f91","Type":"ContainerStarted","Data":"df61175b07c0b900fdb4195d41853f1df30b5bc55033b0f64226d4d914d5fe31"} Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.622448 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kch7n\" (UniqueName: \"kubernetes.io/projected/8ae9b1ff-2630-46e5-a242-4932ff7275b2-kube-api-access-kch7n\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.622546 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-scripts\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.622609 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run-ovn\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.622635 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-log-ovn\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.622659 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc 
kubenswrapper[4907]: I1009 19:45:25.622674 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-additional-scripts\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.723833 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kch7n\" (UniqueName: \"kubernetes.io/projected/8ae9b1ff-2630-46e5-a242-4932ff7275b2-kube-api-access-kch7n\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.723887 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-scripts\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.724029 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run-ovn\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.724091 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-log-ovn\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 
19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.724154 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.724185 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-additional-scripts\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.724343 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-log-ovn\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.724907 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-additional-scripts\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.724973 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 
19:45:25.725162 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run-ovn\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.726984 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-scripts\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.740425 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kch7n\" (UniqueName: \"kubernetes.io/projected/8ae9b1ff-2630-46e5-a242-4932ff7275b2-kube-api-access-kch7n\") pod \"ovn-controller-dz7f2-config-gm8hf\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:25 crc kubenswrapper[4907]: I1009 19:45:25.869050 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.204864 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2-config-gm8hf"] Oct 09 19:45:26 crc kubenswrapper[4907]: W1009 19:45:26.212835 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae9b1ff_2630_46e5_a242_4932ff7275b2.slice/crio-cdb6258c187e7825cbc66dc55a790d64c8b65675bbe976ef30c2fca8b738011f WatchSource:0}: Error finding container cdb6258c187e7825cbc66dc55a790d64c8b65675bbe976ef30c2fca8b738011f: Status 404 returned error can't find the container with id cdb6258c187e7825cbc66dc55a790d64c8b65675bbe976ef30c2fca8b738011f Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.621367 4907 generic.go:334] "Generic (PLEG): container finished" podID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerID="90127cac8301cedbb378bf90ca8583c35071b8c22a6f7cf9a0d8fbe7ef6cfe53" exitCode=0 Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.621457 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05cb258e-fa1a-4978-b143-d6c817ec0f96","Type":"ContainerDied","Data":"90127cac8301cedbb378bf90ca8583c35071b8c22a6f7cf9a0d8fbe7ef6cfe53"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.623303 4907 generic.go:334] "Generic (PLEG): container finished" podID="0ec0c091-4cc8-48a7-a072-fdea2cdc1125" containerID="120bdb60718844df677552c6740ed660025280a11f54a1995b04d679f88585f0" exitCode=0 Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.623350 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a117-account-create-g9ntg" event={"ID":"0ec0c091-4cc8-48a7-a072-fdea2cdc1125","Type":"ContainerDied","Data":"120bdb60718844df677552c6740ed660025280a11f54a1995b04d679f88585f0"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.623368 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a117-account-create-g9ntg" event={"ID":"0ec0c091-4cc8-48a7-a072-fdea2cdc1125","Type":"ContainerStarted","Data":"26e3237f0efdea3ad8f009703430d673d1bfbf6a84eb840976d7fce782688e3f"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.624953 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-gm8hf" event={"ID":"8ae9b1ff-2630-46e5-a242-4932ff7275b2","Type":"ContainerStarted","Data":"4bb03cbd245283b2571c1bdc5bad5e7f5d6c328f8131dea31949fe6744c6042f"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.625061 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-gm8hf" event={"ID":"8ae9b1ff-2630-46e5-a242-4932ff7275b2","Type":"ContainerStarted","Data":"cdb6258c187e7825cbc66dc55a790d64c8b65675bbe976ef30c2fca8b738011f"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.626920 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c","Type":"ContainerStarted","Data":"1e10ff7f190df66a67d83fbbedaba5e91ed8f3fe46703f27d608f919ee5d8219"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.627372 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.630270 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"b7ab28d320eb0381d814929f04fb6416047910779f916af646d7ffeb9528a425"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.630296 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"687200f7b9ac8cdb02f94cfffd0abb1a5842e96c82cc95419817f88c43b1c8e9"} Oct 09 19:45:26 crc 
kubenswrapper[4907]: I1009 19:45:26.630316 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"5f40e736eaa29cfecb57a6e5d766d0be0af45422a48c5ecc886a986f6504d735"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.630325 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"275ea57389349ae66eb2df7176b5dfd24d3dc10cdc3253495529154122071d08"} Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.676675 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.178424268 podStartE2EDuration="57.67666059s" podCreationTimestamp="2025-10-09 19:44:29 +0000 UTC" firstStartedPulling="2025-10-09 19:44:42.912317178 +0000 UTC m=+968.444284667" lastFinishedPulling="2025-10-09 19:44:51.4105535 +0000 UTC m=+976.942520989" observedRunningTime="2025-10-09 19:45:26.67579098 +0000 UTC m=+1012.207758489" watchObservedRunningTime="2025-10-09 19:45:26.67666059 +0000 UTC m=+1012.208628079" Oct 09 19:45:26 crc kubenswrapper[4907]: I1009 19:45:26.697886 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dz7f2-config-gm8hf" podStartSLOduration=1.6978693 podStartE2EDuration="1.6978693s" podCreationTimestamp="2025-10-09 19:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:45:26.692610099 +0000 UTC m=+1012.224577588" watchObservedRunningTime="2025-10-09 19:45:26.6978693 +0000 UTC m=+1012.229836789" Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.025379 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b919-account-create-znlsd" Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.146980 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq9d4\" (UniqueName: \"kubernetes.io/projected/371529d4-9ff3-4104-badb-d3777b626f91-kube-api-access-zq9d4\") pod \"371529d4-9ff3-4104-badb-d3777b626f91\" (UID: \"371529d4-9ff3-4104-badb-d3777b626f91\") " Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.157047 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/371529d4-9ff3-4104-badb-d3777b626f91-kube-api-access-zq9d4" (OuterVolumeSpecName: "kube-api-access-zq9d4") pod "371529d4-9ff3-4104-badb-d3777b626f91" (UID: "371529d4-9ff3-4104-badb-d3777b626f91"). InnerVolumeSpecName "kube-api-access-zq9d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.249198 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq9d4\" (UniqueName: \"kubernetes.io/projected/371529d4-9ff3-4104-badb-d3777b626f91-kube-api-access-zq9d4\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.640230 4907 generic.go:334] "Generic (PLEG): container finished" podID="8ae9b1ff-2630-46e5-a242-4932ff7275b2" containerID="4bb03cbd245283b2571c1bdc5bad5e7f5d6c328f8131dea31949fe6744c6042f" exitCode=0 Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.640301 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-gm8hf" event={"ID":"8ae9b1ff-2630-46e5-a242-4932ff7275b2","Type":"ContainerDied","Data":"4bb03cbd245283b2571c1bdc5bad5e7f5d6c328f8131dea31949fe6744c6042f"} Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.642167 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b919-account-create-znlsd" 
event={"ID":"371529d4-9ff3-4104-badb-d3777b626f91","Type":"ContainerDied","Data":"df61175b07c0b900fdb4195d41853f1df30b5bc55033b0f64226d4d914d5fe31"} Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.642193 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df61175b07c0b900fdb4195d41853f1df30b5bc55033b0f64226d4d914d5fe31" Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.642225 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b919-account-create-znlsd" Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.646314 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05cb258e-fa1a-4978-b143-d6c817ec0f96","Type":"ContainerStarted","Data":"60569705daf757c38afdbbef43fbd00d1607a2a4646f9e9d0d488856f660de4b"} Oct 09 19:45:27 crc kubenswrapper[4907]: I1009 19:45:27.687483 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.739556371 podStartE2EDuration="58.687435421s" podCreationTimestamp="2025-10-09 19:44:29 +0000 UTC" firstStartedPulling="2025-10-09 19:44:42.910030939 +0000 UTC m=+968.441998428" lastFinishedPulling="2025-10-09 19:44:51.857909989 +0000 UTC m=+977.389877478" observedRunningTime="2025-10-09 19:45:27.685721982 +0000 UTC m=+1013.217689511" watchObservedRunningTime="2025-10-09 19:45:27.687435421 +0000 UTC m=+1013.219402910" Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.172285 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a117-account-create-g9ntg" Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.267338 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlnqk\" (UniqueName: \"kubernetes.io/projected/0ec0c091-4cc8-48a7-a072-fdea2cdc1125-kube-api-access-mlnqk\") pod \"0ec0c091-4cc8-48a7-a072-fdea2cdc1125\" (UID: \"0ec0c091-4cc8-48a7-a072-fdea2cdc1125\") " Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.275159 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec0c091-4cc8-48a7-a072-fdea2cdc1125-kube-api-access-mlnqk" (OuterVolumeSpecName: "kube-api-access-mlnqk") pod "0ec0c091-4cc8-48a7-a072-fdea2cdc1125" (UID: "0ec0c091-4cc8-48a7-a072-fdea2cdc1125"). InnerVolumeSpecName "kube-api-access-mlnqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.369809 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlnqk\" (UniqueName: \"kubernetes.io/projected/0ec0c091-4cc8-48a7-a072-fdea2cdc1125-kube-api-access-mlnqk\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.677104 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a117-account-create-g9ntg" event={"ID":"0ec0c091-4cc8-48a7-a072-fdea2cdc1125","Type":"ContainerDied","Data":"26e3237f0efdea3ad8f009703430d673d1bfbf6a84eb840976d7fce782688e3f"} Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.677425 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26e3237f0efdea3ad8f009703430d673d1bfbf6a84eb840976d7fce782688e3f" Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.677499 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a117-account-create-g9ntg" Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.692754 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"76984e4144367f18e951576c6e4e3adb47a6ab4d65bee1f6099b182a512ea66e"} Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.692792 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"e43197d57d1cb6fcfd14eb066a695b16adb83b988ea98c912c1b8f9d3e9e09f3"} Oct 09 19:45:28 crc kubenswrapper[4907]: I1009 19:45:28.692803 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"eae457183244c550e32cf30acf3b114f31a2563c52ff180417009dc0029a3f84"} Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.027169 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093328 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run-ovn\") pod \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093411 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kch7n\" (UniqueName: \"kubernetes.io/projected/8ae9b1ff-2630-46e5-a242-4932ff7275b2-kube-api-access-kch7n\") pod \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run\") pod \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093444 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8ae9b1ff-2630-46e5-a242-4932ff7275b2" (UID: "8ae9b1ff-2630-46e5-a242-4932ff7275b2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093505 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-additional-scripts\") pod \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093551 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-log-ovn\") pod \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093584 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-scripts\") pod \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\" (UID: \"8ae9b1ff-2630-46e5-a242-4932ff7275b2\") " Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.093891 4907 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.094239 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8ae9b1ff-2630-46e5-a242-4932ff7275b2" (UID: "8ae9b1ff-2630-46e5-a242-4932ff7275b2"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.094292 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run" (OuterVolumeSpecName: "var-run") pod "8ae9b1ff-2630-46e5-a242-4932ff7275b2" (UID: "8ae9b1ff-2630-46e5-a242-4932ff7275b2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.094749 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8ae9b1ff-2630-46e5-a242-4932ff7275b2" (UID: "8ae9b1ff-2630-46e5-a242-4932ff7275b2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.094916 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-scripts" (OuterVolumeSpecName: "scripts") pod "8ae9b1ff-2630-46e5-a242-4932ff7275b2" (UID: "8ae9b1ff-2630-46e5-a242-4932ff7275b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.100714 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae9b1ff-2630-46e5-a242-4932ff7275b2-kube-api-access-kch7n" (OuterVolumeSpecName: "kube-api-access-kch7n") pod "8ae9b1ff-2630-46e5-a242-4932ff7275b2" (UID: "8ae9b1ff-2630-46e5-a242-4932ff7275b2"). InnerVolumeSpecName "kube-api-access-kch7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.195718 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kch7n\" (UniqueName: \"kubernetes.io/projected/8ae9b1ff-2630-46e5-a242-4932ff7275b2-kube-api-access-kch7n\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.195755 4907 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.195767 4907 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.195775 4907 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ae9b1ff-2630-46e5-a242-4932ff7275b2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.195783 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ae9b1ff-2630-46e5-a242-4932ff7275b2-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.701808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-gm8hf" event={"ID":"8ae9b1ff-2630-46e5-a242-4932ff7275b2","Type":"ContainerDied","Data":"cdb6258c187e7825cbc66dc55a790d64c8b65675bbe976ef30c2fca8b738011f"} Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.701826 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-gm8hf" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.701848 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb6258c187e7825cbc66dc55a790d64c8b65675bbe976ef30c2fca8b738011f" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.706098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"cabc1630bb2f48b72b1f1cb8660f441cbee1bee48129cc3c393668e3fed7710b"} Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.827617 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a445-account-create-8pwq4"] Oct 09 19:45:29 crc kubenswrapper[4907]: E1009 19:45:29.827960 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec0c091-4cc8-48a7-a072-fdea2cdc1125" containerName="mariadb-account-create" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.827976 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec0c091-4cc8-48a7-a072-fdea2cdc1125" containerName="mariadb-account-create" Oct 09 19:45:29 crc kubenswrapper[4907]: E1009 19:45:29.827993 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="371529d4-9ff3-4104-badb-d3777b626f91" containerName="mariadb-account-create" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.828000 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="371529d4-9ff3-4104-badb-d3777b626f91" containerName="mariadb-account-create" Oct 09 19:45:29 crc kubenswrapper[4907]: E1009 19:45:29.828019 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae9b1ff-2630-46e5-a242-4932ff7275b2" containerName="ovn-config" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.828025 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae9b1ff-2630-46e5-a242-4932ff7275b2" containerName="ovn-config" Oct 09 19:45:29 crc 
kubenswrapper[4907]: I1009 19:45:29.828181 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="371529d4-9ff3-4104-badb-d3777b626f91" containerName="mariadb-account-create" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.828191 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae9b1ff-2630-46e5-a242-4932ff7275b2" containerName="ovn-config" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.828211 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec0c091-4cc8-48a7-a072-fdea2cdc1125" containerName="mariadb-account-create" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.828717 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a445-account-create-8pwq4" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.834251 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.843231 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a445-account-create-8pwq4"] Oct 09 19:45:29 crc kubenswrapper[4907]: I1009 19:45:29.907153 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlrwm\" (UniqueName: \"kubernetes.io/projected/9a8729b0-9a3a-4ecf-8167-430807adc85a-kube-api-access-qlrwm\") pod \"glance-a445-account-create-8pwq4\" (UID: \"9a8729b0-9a3a-4ecf-8167-430807adc85a\") " pod="openstack/glance-a445-account-create-8pwq4" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.008560 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlrwm\" (UniqueName: \"kubernetes.io/projected/9a8729b0-9a3a-4ecf-8167-430807adc85a-kube-api-access-qlrwm\") pod \"glance-a445-account-create-8pwq4\" (UID: \"9a8729b0-9a3a-4ecf-8167-430807adc85a\") " pod="openstack/glance-a445-account-create-8pwq4" Oct 09 19:45:30 crc kubenswrapper[4907]: 
I1009 19:45:30.038294 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlrwm\" (UniqueName: \"kubernetes.io/projected/9a8729b0-9a3a-4ecf-8167-430807adc85a-kube-api-access-qlrwm\") pod \"glance-a445-account-create-8pwq4\" (UID: \"9a8729b0-9a3a-4ecf-8167-430807adc85a\") " pod="openstack/glance-a445-account-create-8pwq4" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.144889 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a445-account-create-8pwq4" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.182122 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dz7f2" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.269910 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dz7f2-config-gm8hf"] Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.287885 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dz7f2-config-gm8hf"] Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.319777 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dz7f2-config-jvx9f"] Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.321379 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.330782 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2-config-jvx9f"] Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.333185 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.414882 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-scripts\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.414923 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dkj\" (UniqueName: \"kubernetes.io/projected/c50045b0-274e-48d3-a5db-64689efa99fd-kube-api-access-d7dkj\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.415005 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-additional-scripts\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.415076 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run-ovn\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: 
\"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.415092 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-log-ovn\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.415120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.515973 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run-ovn\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516022 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-log-ovn\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: 
\"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516085 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-scripts\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516107 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dkj\" (UniqueName: \"kubernetes.io/projected/c50045b0-274e-48d3-a5db-64689efa99fd-kube-api-access-d7dkj\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516185 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-additional-scripts\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516611 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run-ovn\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516626 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: 
\"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.516662 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-log-ovn\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.517117 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-additional-scripts\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.518220 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-scripts\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.540221 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dkj\" (UniqueName: \"kubernetes.io/projected/c50045b0-274e-48d3-a5db-64689efa99fd-kube-api-access-d7dkj\") pod \"ovn-controller-dz7f2-config-jvx9f\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.644048 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:30 crc kubenswrapper[4907]: I1009 19:45:30.666138 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a445-account-create-8pwq4"] Oct 09 19:45:30 crc kubenswrapper[4907]: W1009 19:45:30.749161 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a8729b0_9a3a_4ecf_8167_430807adc85a.slice/crio-9ac6b5496a29ee91d96cf297e35d0819287a37eb9ec10ffc006b320f7ec4097f WatchSource:0}: Error finding container 9ac6b5496a29ee91d96cf297e35d0819287a37eb9ec10ffc006b320f7ec4097f: Status 404 returned error can't find the container with id 9ac6b5496a29ee91d96cf297e35d0819287a37eb9ec10ffc006b320f7ec4097f Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.160926 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae9b1ff-2630-46e5-a242-4932ff7275b2" path="/var/lib/kubelet/pods/8ae9b1ff-2630-46e5-a242-4932ff7275b2/volumes" Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.174561 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.250447 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2-config-jvx9f"] Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.727048 4907 generic.go:334] "Generic (PLEG): container finished" podID="9a8729b0-9a3a-4ecf-8167-430807adc85a" containerID="af61657c7193fd70c5dce7e743e2ed58a790c5f7a97a1364df82fb9133bbdcd4" exitCode=0 Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.727127 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a445-account-create-8pwq4" event={"ID":"9a8729b0-9a3a-4ecf-8167-430807adc85a","Type":"ContainerDied","Data":"af61657c7193fd70c5dce7e743e2ed58a790c5f7a97a1364df82fb9133bbdcd4"} Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 
19:45:31.727408 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a445-account-create-8pwq4" event={"ID":"9a8729b0-9a3a-4ecf-8167-430807adc85a","Type":"ContainerStarted","Data":"9ac6b5496a29ee91d96cf297e35d0819287a37eb9ec10ffc006b320f7ec4097f"} Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.729268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-jvx9f" event={"ID":"c50045b0-274e-48d3-a5db-64689efa99fd","Type":"ContainerStarted","Data":"ece5cb46cee0a32a21fa734ec664b36a3e7b8bc7ff5a44e834eae94c1eb0cfdf"} Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.735828 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"fc8c5759023fe2b294702051643638875ad95f22ecf447a49ae2c8e7f3438b55"} Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.735879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"a20814291be5db853b2970188d9541ad239c5b474428277687a075b24a903180"} Oct 09 19:45:31 crc kubenswrapper[4907]: I1009 19:45:31.735893 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"03efcebf88515e7b9150410381f64c791a5b826c07e1993dc65e2889940b3461"} Oct 09 19:45:32 crc kubenswrapper[4907]: I1009 19:45:32.747794 4907 generic.go:334] "Generic (PLEG): container finished" podID="c50045b0-274e-48d3-a5db-64689efa99fd" containerID="d9379f8fa08d8ac3eec835d033c321b7835bb49c3641da5df4a678ca4cad124d" exitCode=0 Oct 09 19:45:32 crc kubenswrapper[4907]: I1009 19:45:32.747844 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-jvx9f" 
event={"ID":"c50045b0-274e-48d3-a5db-64689efa99fd","Type":"ContainerDied","Data":"d9379f8fa08d8ac3eec835d033c321b7835bb49c3641da5df4a678ca4cad124d"} Oct 09 19:45:32 crc kubenswrapper[4907]: I1009 19:45:32.759191 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"0358155e53f8095045bc5f5ae5b80b32d6c8cca93588d67f7cd5f5ee3e6a3248"} Oct 09 19:45:32 crc kubenswrapper[4907]: I1009 19:45:32.759239 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"f5897bfc3269e991e38372eba722db5d8e3eefed95b05680e0de290a19b9f7bf"} Oct 09 19:45:32 crc kubenswrapper[4907]: I1009 19:45:32.759255 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"3029a971752d950846e157e3180c5e9f3675ae36fd487c6f270100751ee22944"} Oct 09 19:45:32 crc kubenswrapper[4907]: I1009 19:45:32.759271 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fd0a266f-be5f-4162-87fb-7389f11c37ab","Type":"ContainerStarted","Data":"0f8395bc948c1247c5eb44495b2f4ba707b1a5f4d433e8d6170d356921f00b6b"} Oct 09 19:45:32 crc kubenswrapper[4907]: I1009 19:45:32.840379 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.825029248 podStartE2EDuration="26.840345769s" podCreationTimestamp="2025-10-09 19:45:06 +0000 UTC" firstStartedPulling="2025-10-09 19:45:23.793240728 +0000 UTC m=+1009.325208227" lastFinishedPulling="2025-10-09 19:45:30.808557269 +0000 UTC m=+1016.340524748" observedRunningTime="2025-10-09 19:45:32.815487944 +0000 UTC m=+1018.347455453" watchObservedRunningTime="2025-10-09 19:45:32.840345769 +0000 UTC m=+1018.372313298" Oct 09 19:45:33 crc 
kubenswrapper[4907]: I1009 19:45:33.109053 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-9tqbs"] Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.111084 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.112717 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a445-account-create-8pwq4" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.112998 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.127307 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-9tqbs"] Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.152992 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.153036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.153070 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: 
\"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.153093 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-config\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.153110 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w5z4\" (UniqueName: \"kubernetes.io/projected/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-kube-api-access-5w5z4\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.153141 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.253881 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlrwm\" (UniqueName: \"kubernetes.io/projected/9a8729b0-9a3a-4ecf-8167-430807adc85a-kube-api-access-qlrwm\") pod \"9a8729b0-9a3a-4ecf-8167-430807adc85a\" (UID: \"9a8729b0-9a3a-4ecf-8167-430807adc85a\") " Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.254306 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: 
\"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.254342 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.254380 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.254407 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-config\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.254438 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w5z4\" (UniqueName: \"kubernetes.io/projected/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-kube-api-access-5w5z4\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.254481 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.255263 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.255283 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.255412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-config\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.255424 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.255491 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.262725 4907 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a8729b0-9a3a-4ecf-8167-430807adc85a-kube-api-access-qlrwm" (OuterVolumeSpecName: "kube-api-access-qlrwm") pod "9a8729b0-9a3a-4ecf-8167-430807adc85a" (UID: "9a8729b0-9a3a-4ecf-8167-430807adc85a"). InnerVolumeSpecName "kube-api-access-qlrwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.271174 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w5z4\" (UniqueName: \"kubernetes.io/projected/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-kube-api-access-5w5z4\") pod \"dnsmasq-dns-77585f5f8c-9tqbs\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.356079 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlrwm\" (UniqueName: \"kubernetes.io/projected/9a8729b0-9a3a-4ecf-8167-430807adc85a-kube-api-access-qlrwm\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.466367 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.766482 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a445-account-create-8pwq4" event={"ID":"9a8729b0-9a3a-4ecf-8167-430807adc85a","Type":"ContainerDied","Data":"9ac6b5496a29ee91d96cf297e35d0819287a37eb9ec10ffc006b320f7ec4097f"} Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.766872 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac6b5496a29ee91d96cf297e35d0819287a37eb9ec10ffc006b320f7ec4097f" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.766606 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a445-account-create-8pwq4" Oct 09 19:45:33 crc kubenswrapper[4907]: I1009 19:45:33.923681 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-9tqbs"] Oct 09 19:45:33 crc kubenswrapper[4907]: W1009 19:45:33.943693 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d55cd3b_d20a_4307_a73d_f6f3fb16f715.slice/crio-ec939bc83616eb1fa1efb5349e487b6093fdede24f4928694c5e9c17ab3538b3 WatchSource:0}: Error finding container ec939bc83616eb1fa1efb5349e487b6093fdede24f4928694c5e9c17ab3538b3: Status 404 returned error can't find the container with id ec939bc83616eb1fa1efb5349e487b6093fdede24f4928694c5e9c17ab3538b3 Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.021924 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.170320 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run\") pod \"c50045b0-274e-48d3-a5db-64689efa99fd\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.170407 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run" (OuterVolumeSpecName: "var-run") pod "c50045b0-274e-48d3-a5db-64689efa99fd" (UID: "c50045b0-274e-48d3-a5db-64689efa99fd"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.170493 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-additional-scripts\") pod \"c50045b0-274e-48d3-a5db-64689efa99fd\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.170771 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7dkj\" (UniqueName: \"kubernetes.io/projected/c50045b0-274e-48d3-a5db-64689efa99fd-kube-api-access-d7dkj\") pod \"c50045b0-274e-48d3-a5db-64689efa99fd\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.170830 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-log-ovn\") pod \"c50045b0-274e-48d3-a5db-64689efa99fd\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.170883 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-scripts\") pod \"c50045b0-274e-48d3-a5db-64689efa99fd\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171001 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run-ovn\") pod \"c50045b0-274e-48d3-a5db-64689efa99fd\" (UID: \"c50045b0-274e-48d3-a5db-64689efa99fd\") " Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.170995 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c50045b0-274e-48d3-a5db-64689efa99fd" (UID: "c50045b0-274e-48d3-a5db-64689efa99fd"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171095 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c50045b0-274e-48d3-a5db-64689efa99fd" (UID: "c50045b0-274e-48d3-a5db-64689efa99fd"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171096 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c50045b0-274e-48d3-a5db-64689efa99fd" (UID: "c50045b0-274e-48d3-a5db-64689efa99fd"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171736 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-scripts" (OuterVolumeSpecName: "scripts") pod "c50045b0-274e-48d3-a5db-64689efa99fd" (UID: "c50045b0-274e-48d3-a5db-64689efa99fd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171750 4907 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171783 4907 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171809 4907 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.171835 4907 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c50045b0-274e-48d3-a5db-64689efa99fd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.174613 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50045b0-274e-48d3-a5db-64689efa99fd-kube-api-access-d7dkj" (OuterVolumeSpecName: "kube-api-access-d7dkj") pod "c50045b0-274e-48d3-a5db-64689efa99fd" (UID: "c50045b0-274e-48d3-a5db-64689efa99fd"). InnerVolumeSpecName "kube-api-access-d7dkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.273902 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7dkj\" (UniqueName: \"kubernetes.io/projected/c50045b0-274e-48d3-a5db-64689efa99fd-kube-api-access-d7dkj\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.273957 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c50045b0-274e-48d3-a5db-64689efa99fd-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.776373 4907 generic.go:334] "Generic (PLEG): container finished" podID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerID="4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96" exitCode=0 Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.776541 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" event={"ID":"7d55cd3b-d20a-4307-a73d-f6f3fb16f715","Type":"ContainerDied","Data":"4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96"} Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.776613 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" event={"ID":"7d55cd3b-d20a-4307-a73d-f6f3fb16f715","Type":"ContainerStarted","Data":"ec939bc83616eb1fa1efb5349e487b6093fdede24f4928694c5e9c17ab3538b3"} Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.778725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-jvx9f" event={"ID":"c50045b0-274e-48d3-a5db-64689efa99fd","Type":"ContainerDied","Data":"ece5cb46cee0a32a21fa734ec664b36a3e7b8bc7ff5a44e834eae94c1eb0cfdf"} Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.778760 4907 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ece5cb46cee0a32a21fa734ec664b36a3e7b8bc7ff5a44e834eae94c1eb0cfdf" Oct 09 19:45:34 crc kubenswrapper[4907]: I1009 19:45:34.778782 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-jvx9f" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.056327 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-64pxz"] Oct 09 19:45:35 crc kubenswrapper[4907]: E1009 19:45:35.057084 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50045b0-274e-48d3-a5db-64689efa99fd" containerName="ovn-config" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.057105 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50045b0-274e-48d3-a5db-64689efa99fd" containerName="ovn-config" Oct 09 19:45:35 crc kubenswrapper[4907]: E1009 19:45:35.057137 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8729b0-9a3a-4ecf-8167-430807adc85a" containerName="mariadb-account-create" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.057147 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8729b0-9a3a-4ecf-8167-430807adc85a" containerName="mariadb-account-create" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.057332 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8729b0-9a3a-4ecf-8167-430807adc85a" containerName="mariadb-account-create" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.057365 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c50045b0-274e-48d3-a5db-64689efa99fd" containerName="ovn-config" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.058075 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.060586 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.061207 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sjjmv" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.075595 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-64pxz"] Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.115973 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dz7f2-config-jvx9f"] Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.122409 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dz7f2-config-jvx9f"] Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.187367 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50045b0-274e-48d3-a5db-64689efa99fd" path="/var/lib/kubelet/pods/c50045b0-274e-48d3-a5db-64689efa99fd/volumes" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.187629 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-config-data\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.187764 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-combined-ca-bundle\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.187847 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpt2\" (UniqueName: \"kubernetes.io/projected/4638ed26-9fde-4eca-acc3-9d292f50e4bb-kube-api-access-6tpt2\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.187886 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-db-sync-config-data\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.194992 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dz7f2-config-6l5f8"] Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.202421 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.205231 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.215720 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2-config-6l5f8"] Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.289858 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-combined-ca-bundle\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.289929 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tpt2\" (UniqueName: \"kubernetes.io/projected/4638ed26-9fde-4eca-acc3-9d292f50e4bb-kube-api-access-6tpt2\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.289966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-db-sync-config-data\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.290037 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-config-data\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.293600 4907 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.298732 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-combined-ca-bundle\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.303453 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-config-data\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.304894 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-db-sync-config-data\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.307053 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tpt2\" (UniqueName: \"kubernetes.io/projected/4638ed26-9fde-4eca-acc3-9d292f50e4bb-kube-api-access-6tpt2\") pod \"glance-db-sync-64pxz\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.376395 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sjjmv" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.385819 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.393363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-log-ovn\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.393403 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-additional-scripts\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.393438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run-ovn\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.393536 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-scripts\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.393579 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjzxx\" (UniqueName: 
\"kubernetes.io/projected/5614d577-d71c-498c-afa1-8cf56201e093-kube-api-access-hjzxx\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.393612 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.494990 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-log-ovn\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.495034 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-additional-scripts\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.495069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run-ovn\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.495114 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-scripts\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.495155 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjzxx\" (UniqueName: \"kubernetes.io/projected/5614d577-d71c-498c-afa1-8cf56201e093-kube-api-access-hjzxx\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.495192 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.495643 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.495710 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-log-ovn\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.496520 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-additional-scripts\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.496583 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run-ovn\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.498657 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-scripts\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.517785 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjzxx\" (UniqueName: \"kubernetes.io/projected/5614d577-d71c-498c-afa1-8cf56201e093-kube-api-access-hjzxx\") pod \"ovn-controller-dz7f2-config-6l5f8\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.522635 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.790529 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" event={"ID":"7d55cd3b-d20a-4307-a73d-f6f3fb16f715","Type":"ContainerStarted","Data":"81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6"} Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.790764 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.808731 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" podStartSLOduration=2.808714215 podStartE2EDuration="2.808714215s" podCreationTimestamp="2025-10-09 19:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:45:35.807122248 +0000 UTC m=+1021.339089787" watchObservedRunningTime="2025-10-09 19:45:35.808714215 +0000 UTC m=+1021.340681704" Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.972742 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-64pxz"] Oct 09 19:45:35 crc kubenswrapper[4907]: W1009 19:45:35.972787 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4638ed26_9fde_4eca_acc3_9d292f50e4bb.slice/crio-0285f4a583118038c6182a480fcd4af073fe315b782bcf17bb1dfc7e80960c40 WatchSource:0}: Error finding container 0285f4a583118038c6182a480fcd4af073fe315b782bcf17bb1dfc7e80960c40: Status 404 returned error can't find the container with id 0285f4a583118038c6182a480fcd4af073fe315b782bcf17bb1dfc7e80960c40 Oct 09 19:45:35 crc kubenswrapper[4907]: I1009 19:45:35.997321 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-dz7f2-config-6l5f8"] Oct 09 19:45:36 crc kubenswrapper[4907]: W1009 19:45:36.000608 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5614d577_d71c_498c_afa1_8cf56201e093.slice/crio-f91bf0b3cbe513fee34903f42cac7a1e458aae543b5f2c8f99188f9c2dfed0b1 WatchSource:0}: Error finding container f91bf0b3cbe513fee34903f42cac7a1e458aae543b5f2c8f99188f9c2dfed0b1: Status 404 returned error can't find the container with id f91bf0b3cbe513fee34903f42cac7a1e458aae543b5f2c8f99188f9c2dfed0b1 Oct 09 19:45:36 crc kubenswrapper[4907]: I1009 19:45:36.800935 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-64pxz" event={"ID":"4638ed26-9fde-4eca-acc3-9d292f50e4bb","Type":"ContainerStarted","Data":"0285f4a583118038c6182a480fcd4af073fe315b782bcf17bb1dfc7e80960c40"} Oct 09 19:45:36 crc kubenswrapper[4907]: I1009 19:45:36.803217 4907 generic.go:334] "Generic (PLEG): container finished" podID="5614d577-d71c-498c-afa1-8cf56201e093" containerID="d8787c13c73cb2c1a48b68c546c8a1e8bd101a32ebeaf3f4e781b61df9d21a10" exitCode=0 Oct 09 19:45:36 crc kubenswrapper[4907]: I1009 19:45:36.803281 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-6l5f8" event={"ID":"5614d577-d71c-498c-afa1-8cf56201e093","Type":"ContainerDied","Data":"d8787c13c73cb2c1a48b68c546c8a1e8bd101a32ebeaf3f4e781b61df9d21a10"} Oct 09 19:45:36 crc kubenswrapper[4907]: I1009 19:45:36.803385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-6l5f8" event={"ID":"5614d577-d71c-498c-afa1-8cf56201e093","Type":"ContainerStarted","Data":"f91bf0b3cbe513fee34903f42cac7a1e458aae543b5f2c8f99188f9c2dfed0b1"} Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.170953 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337397 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-log-ovn\") pod \"5614d577-d71c-498c-afa1-8cf56201e093\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337497 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5614d577-d71c-498c-afa1-8cf56201e093" (UID: "5614d577-d71c-498c-afa1-8cf56201e093"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-scripts\") pod \"5614d577-d71c-498c-afa1-8cf56201e093\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337587 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjzxx\" (UniqueName: \"kubernetes.io/projected/5614d577-d71c-498c-afa1-8cf56201e093-kube-api-access-hjzxx\") pod \"5614d577-d71c-498c-afa1-8cf56201e093\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337687 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run-ovn\") pod \"5614d577-d71c-498c-afa1-8cf56201e093\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337732 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run\") pod \"5614d577-d71c-498c-afa1-8cf56201e093\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337763 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-additional-scripts\") pod \"5614d577-d71c-498c-afa1-8cf56201e093\" (UID: \"5614d577-d71c-498c-afa1-8cf56201e093\") " Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337774 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5614d577-d71c-498c-afa1-8cf56201e093" (UID: "5614d577-d71c-498c-afa1-8cf56201e093"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.337854 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run" (OuterVolumeSpecName: "var-run") pod "5614d577-d71c-498c-afa1-8cf56201e093" (UID: "5614d577-d71c-498c-afa1-8cf56201e093"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.338432 4907 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.338447 4907 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.338459 4907 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5614d577-d71c-498c-afa1-8cf56201e093-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.338785 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5614d577-d71c-498c-afa1-8cf56201e093" (UID: "5614d577-d71c-498c-afa1-8cf56201e093"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.340889 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-scripts" (OuterVolumeSpecName: "scripts") pod "5614d577-d71c-498c-afa1-8cf56201e093" (UID: "5614d577-d71c-498c-afa1-8cf56201e093"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.344919 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5614d577-d71c-498c-afa1-8cf56201e093-kube-api-access-hjzxx" (OuterVolumeSpecName: "kube-api-access-hjzxx") pod "5614d577-d71c-498c-afa1-8cf56201e093" (UID: "5614d577-d71c-498c-afa1-8cf56201e093"). InnerVolumeSpecName "kube-api-access-hjzxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.440157 4907 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.440194 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5614d577-d71c-498c-afa1-8cf56201e093-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.440208 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjzxx\" (UniqueName: \"kubernetes.io/projected/5614d577-d71c-498c-afa1-8cf56201e093-kube-api-access-hjzxx\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.838993 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-6l5f8" event={"ID":"5614d577-d71c-498c-afa1-8cf56201e093","Type":"ContainerDied","Data":"f91bf0b3cbe513fee34903f42cac7a1e458aae543b5f2c8f99188f9c2dfed0b1"} Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.839191 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f91bf0b3cbe513fee34903f42cac7a1e458aae543b5f2c8f99188f9c2dfed0b1" Oct 09 19:45:38 crc kubenswrapper[4907]: I1009 19:45:38.839076 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-6l5f8" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.258847 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dz7f2-config-6l5f8"] Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.269951 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dz7f2-config-6l5f8"] Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.391136 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dz7f2-config-fr26z"] Oct 09 19:45:39 crc kubenswrapper[4907]: E1009 19:45:39.391795 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5614d577-d71c-498c-afa1-8cf56201e093" containerName="ovn-config" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.391817 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5614d577-d71c-498c-afa1-8cf56201e093" containerName="ovn-config" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.392035 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5614d577-d71c-498c-afa1-8cf56201e093" containerName="ovn-config" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.392874 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.400021 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.402353 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2-config-fr26z"] Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.558698 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lltw8\" (UniqueName: \"kubernetes.io/projected/7f500019-e0f4-408a-b24a-6d0e145e4d47-kube-api-access-lltw8\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.558884 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run-ovn\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.558931 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-scripts\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.559153 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: 
\"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.559232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-additional-scripts\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.559304 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-log-ovn\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.660124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.660180 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-additional-scripts\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.660203 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-log-ovn\") pod 
\"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.660242 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lltw8\" (UniqueName: \"kubernetes.io/projected/7f500019-e0f4-408a-b24a-6d0e145e4d47-kube-api-access-lltw8\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.660292 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run-ovn\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.660310 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-scripts\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.662196 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-scripts\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.662429 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: 
\"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.662836 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-additional-scripts\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.662890 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-log-ovn\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.663135 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run-ovn\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.693216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lltw8\" (UniqueName: \"kubernetes.io/projected/7f500019-e0f4-408a-b24a-6d0e145e4d47-kube-api-access-lltw8\") pod \"ovn-controller-dz7f2-config-fr26z\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:39 crc kubenswrapper[4907]: I1009 19:45:39.713071 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:40 crc kubenswrapper[4907]: I1009 19:45:40.265223 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dz7f2-config-fr26z"] Oct 09 19:45:40 crc kubenswrapper[4907]: W1009 19:45:40.271592 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f500019_e0f4_408a_b24a_6d0e145e4d47.slice/crio-ef09d261edc1d56b193b5d97fcf7904ed55d99f8e59dc4763e2547678233b0d0 WatchSource:0}: Error finding container ef09d261edc1d56b193b5d97fcf7904ed55d99f8e59dc4763e2547678233b0d0: Status 404 returned error can't find the container with id ef09d261edc1d56b193b5d97fcf7904ed55d99f8e59dc4763e2547678233b0d0 Oct 09 19:45:40 crc kubenswrapper[4907]: E1009 19:45:40.765705 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f500019_e0f4_408a_b24a_6d0e145e4d47.slice/crio-conmon-9a8068e2680dad27b8707e21478cc6ff7f3ba7ed24ad8f47e22cfd3d410c8e80.scope\": RecentStats: unable to find data in memory cache]" Oct 09 19:45:40 crc kubenswrapper[4907]: I1009 19:45:40.867128 4907 generic.go:334] "Generic (PLEG): container finished" podID="7f500019-e0f4-408a-b24a-6d0e145e4d47" containerID="9a8068e2680dad27b8707e21478cc6ff7f3ba7ed24ad8f47e22cfd3d410c8e80" exitCode=0 Oct 09 19:45:40 crc kubenswrapper[4907]: I1009 19:45:40.867188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-fr26z" event={"ID":"7f500019-e0f4-408a-b24a-6d0e145e4d47","Type":"ContainerDied","Data":"9a8068e2680dad27b8707e21478cc6ff7f3ba7ed24ad8f47e22cfd3d410c8e80"} Oct 09 19:45:40 crc kubenswrapper[4907]: I1009 19:45:40.867289 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-fr26z" 
event={"ID":"7f500019-e0f4-408a-b24a-6d0e145e4d47","Type":"ContainerStarted","Data":"ef09d261edc1d56b193b5d97fcf7904ed55d99f8e59dc4763e2547678233b0d0"} Oct 09 19:45:40 crc kubenswrapper[4907]: I1009 19:45:40.888661 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.160904 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5614d577-d71c-498c-afa1-8cf56201e093" path="/var/lib/kubelet/pods/5614d577-d71c-498c-afa1-8cf56201e093/volumes" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.162193 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-m7fvd"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.163318 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m7fvd" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.171927 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m7fvd"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.179658 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.276024 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7dcmv"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.288857 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntm92\" (UniqueName: \"kubernetes.io/projected/9ed98c56-4150-49f3-b183-440cd6fabc12-kube-api-access-ntm92\") pod \"cinder-db-create-m7fvd\" (UID: \"9ed98c56-4150-49f3-b183-440cd6fabc12\") " pod="openstack/cinder-db-create-m7fvd" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.295238 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7dcmv" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.295870 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7dcmv"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.390783 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntm92\" (UniqueName: \"kubernetes.io/projected/9ed98c56-4150-49f3-b183-440cd6fabc12-kube-api-access-ntm92\") pod \"cinder-db-create-m7fvd\" (UID: \"9ed98c56-4150-49f3-b183-440cd6fabc12\") " pod="openstack/cinder-db-create-m7fvd" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.390889 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zf74\" (UniqueName: \"kubernetes.io/projected/913f8dda-cdec-4099-9f66-8046eeab3371-kube-api-access-2zf74\") pod \"barbican-db-create-7dcmv\" (UID: \"913f8dda-cdec-4099-9f66-8046eeab3371\") " pod="openstack/barbican-db-create-7dcmv" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.410830 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntm92\" (UniqueName: \"kubernetes.io/projected/9ed98c56-4150-49f3-b183-440cd6fabc12-kube-api-access-ntm92\") pod \"cinder-db-create-m7fvd\" (UID: \"9ed98c56-4150-49f3-b183-440cd6fabc12\") " pod="openstack/cinder-db-create-m7fvd" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.465791 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zn9nz"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.467721 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zn9nz" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.486543 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zn9nz"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.493010 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zf74\" (UniqueName: \"kubernetes.io/projected/913f8dda-cdec-4099-9f66-8046eeab3371-kube-api-access-2zf74\") pod \"barbican-db-create-7dcmv\" (UID: \"913f8dda-cdec-4099-9f66-8046eeab3371\") " pod="openstack/barbican-db-create-7dcmv" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.497382 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m7fvd" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.516111 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5tzcp"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.517519 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.521703 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.522016 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.522138 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.522378 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9tb58" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.525789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zf74\" (UniqueName: \"kubernetes.io/projected/913f8dda-cdec-4099-9f66-8046eeab3371-kube-api-access-2zf74\") pod \"barbican-db-create-7dcmv\" (UID: \"913f8dda-cdec-4099-9f66-8046eeab3371\") " pod="openstack/barbican-db-create-7dcmv" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.528257 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5tzcp"] Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.595331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vghh2\" (UniqueName: \"kubernetes.io/projected/9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3-kube-api-access-vghh2\") pod \"neutron-db-create-zn9nz\" (UID: \"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3\") " pod="openstack/neutron-db-create-zn9nz" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.621871 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7dcmv" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.697135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vghh2\" (UniqueName: \"kubernetes.io/projected/9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3-kube-api-access-vghh2\") pod \"neutron-db-create-zn9nz\" (UID: \"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3\") " pod="openstack/neutron-db-create-zn9nz" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.697200 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-combined-ca-bundle\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.697233 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88sl\" (UniqueName: \"kubernetes.io/projected/5ed1546f-98a4-4b57-b79e-8defa04c38b8-kube-api-access-h88sl\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.697262 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-config-data\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.724060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vghh2\" (UniqueName: \"kubernetes.io/projected/9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3-kube-api-access-vghh2\") pod \"neutron-db-create-zn9nz\" (UID: \"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3\") 
" pod="openstack/neutron-db-create-zn9nz" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.784502 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zn9nz" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.799919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-combined-ca-bundle\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.799993 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h88sl\" (UniqueName: \"kubernetes.io/projected/5ed1546f-98a4-4b57-b79e-8defa04c38b8-kube-api-access-h88sl\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.800034 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-config-data\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.803336 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-combined-ca-bundle\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.803347 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-config-data\") pod 
\"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.817945 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88sl\" (UniqueName: \"kubernetes.io/projected/5ed1546f-98a4-4b57-b79e-8defa04c38b8-kube-api-access-h88sl\") pod \"keystone-db-sync-5tzcp\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:41 crc kubenswrapper[4907]: I1009 19:45:41.872201 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:43 crc kubenswrapper[4907]: I1009 19:45:43.468691 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:45:43 crc kubenswrapper[4907]: I1009 19:45:43.532379 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc2mk"] Oct 09 19:45:43 crc kubenswrapper[4907]: I1009 19:45:43.532940 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gc2mk" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerName="dnsmasq-dns" containerID="cri-o://4ee8eb790bbfb82cf9e7658d447906e04ac246d21dc646a627c9f8f1afb7807b" gracePeriod=10 Oct 09 19:45:43 crc kubenswrapper[4907]: I1009 19:45:43.902401 4907 generic.go:334] "Generic (PLEG): container finished" podID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerID="4ee8eb790bbfb82cf9e7658d447906e04ac246d21dc646a627c9f8f1afb7807b" exitCode=0 Oct 09 19:45:43 crc kubenswrapper[4907]: I1009 19:45:43.902500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc2mk" event={"ID":"c23c05c5-d7fc-4792-be12-5cfabce11bf7","Type":"ContainerDied","Data":"4ee8eb790bbfb82cf9e7658d447906e04ac246d21dc646a627c9f8f1afb7807b"} Oct 09 19:45:46 crc kubenswrapper[4907]: I1009 
19:45:46.422239 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-gc2mk" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.903493 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.950670 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run\") pod \"7f500019-e0f4-408a-b24a-6d0e145e4d47\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.950717 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-additional-scripts\") pod \"7f500019-e0f4-408a-b24a-6d0e145e4d47\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.950757 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-log-ovn\") pod \"7f500019-e0f4-408a-b24a-6d0e145e4d47\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.950791 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-scripts\") pod \"7f500019-e0f4-408a-b24a-6d0e145e4d47\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.950807 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run" (OuterVolumeSpecName: "var-run") pod "7f500019-e0f4-408a-b24a-6d0e145e4d47" (UID: "7f500019-e0f4-408a-b24a-6d0e145e4d47"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.950835 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lltw8\" (UniqueName: \"kubernetes.io/projected/7f500019-e0f4-408a-b24a-6d0e145e4d47-kube-api-access-lltw8\") pod \"7f500019-e0f4-408a-b24a-6d0e145e4d47\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.950958 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run-ovn\") pod \"7f500019-e0f4-408a-b24a-6d0e145e4d47\" (UID: \"7f500019-e0f4-408a-b24a-6d0e145e4d47\") " Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.951054 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7f500019-e0f4-408a-b24a-6d0e145e4d47" (UID: "7f500019-e0f4-408a-b24a-6d0e145e4d47"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.951186 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7f500019-e0f4-408a-b24a-6d0e145e4d47" (UID: "7f500019-e0f4-408a-b24a-6d0e145e4d47"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.951731 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7f500019-e0f4-408a-b24a-6d0e145e4d47" (UID: "7f500019-e0f4-408a-b24a-6d0e145e4d47"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.951804 4907 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.951828 4907 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.951844 4907 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f500019-e0f4-408a-b24a-6d0e145e4d47-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.952161 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-scripts" (OuterVolumeSpecName: "scripts") pod "7f500019-e0f4-408a-b24a-6d0e145e4d47" (UID: "7f500019-e0f4-408a-b24a-6d0e145e4d47"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.966274 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f500019-e0f4-408a-b24a-6d0e145e4d47-kube-api-access-lltw8" (OuterVolumeSpecName: "kube-api-access-lltw8") pod "7f500019-e0f4-408a-b24a-6d0e145e4d47" (UID: "7f500019-e0f4-408a-b24a-6d0e145e4d47"). InnerVolumeSpecName "kube-api-access-lltw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.979663 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dz7f2-config-fr26z" event={"ID":"7f500019-e0f4-408a-b24a-6d0e145e4d47","Type":"ContainerDied","Data":"ef09d261edc1d56b193b5d97fcf7904ed55d99f8e59dc4763e2547678233b0d0"} Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.979709 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef09d261edc1d56b193b5d97fcf7904ed55d99f8e59dc4763e2547678233b0d0" Oct 09 19:45:48 crc kubenswrapper[4907]: I1009 19:45:48.979743 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dz7f2-config-fr26z" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.005700 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.052729 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlbwg\" (UniqueName: \"kubernetes.io/projected/c23c05c5-d7fc-4792-be12-5cfabce11bf7-kube-api-access-nlbwg\") pod \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.052798 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-nb\") pod \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.052920 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-config\") pod \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.052966 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-sb\") pod \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.053091 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-dns-svc\") pod \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\" (UID: \"c23c05c5-d7fc-4792-be12-5cfabce11bf7\") " Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.053520 4907 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.053545 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f500019-e0f4-408a-b24a-6d0e145e4d47-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.053558 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lltw8\" (UniqueName: \"kubernetes.io/projected/7f500019-e0f4-408a-b24a-6d0e145e4d47-kube-api-access-lltw8\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.058995 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23c05c5-d7fc-4792-be12-5cfabce11bf7-kube-api-access-nlbwg" (OuterVolumeSpecName: "kube-api-access-nlbwg") pod "c23c05c5-d7fc-4792-be12-5cfabce11bf7" (UID: "c23c05c5-d7fc-4792-be12-5cfabce11bf7"). InnerVolumeSpecName "kube-api-access-nlbwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.094654 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-config" (OuterVolumeSpecName: "config") pod "c23c05c5-d7fc-4792-be12-5cfabce11bf7" (UID: "c23c05c5-d7fc-4792-be12-5cfabce11bf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.101199 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c23c05c5-d7fc-4792-be12-5cfabce11bf7" (UID: "c23c05c5-d7fc-4792-be12-5cfabce11bf7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.105306 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c23c05c5-d7fc-4792-be12-5cfabce11bf7" (UID: "c23c05c5-d7fc-4792-be12-5cfabce11bf7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.113067 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c23c05c5-d7fc-4792-be12-5cfabce11bf7" (UID: "c23c05c5-d7fc-4792-be12-5cfabce11bf7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.154642 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.155610 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.155886 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlbwg\" (UniqueName: \"kubernetes.io/projected/c23c05c5-d7fc-4792-be12-5cfabce11bf7-kube-api-access-nlbwg\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.155951 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 
19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.156008 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c05c5-d7fc-4792-be12-5cfabce11bf7-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.250795 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7dcmv"] Oct 09 19:45:49 crc kubenswrapper[4907]: W1009 19:45:49.254119 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod913f8dda_cdec_4099_9f66_8046eeab3371.slice/crio-1d84af3330b15a075752928fb4a0de41d45eba71579b3b5f50203844b7d65106 WatchSource:0}: Error finding container 1d84af3330b15a075752928fb4a0de41d45eba71579b3b5f50203844b7d65106: Status 404 returned error can't find the container with id 1d84af3330b15a075752928fb4a0de41d45eba71579b3b5f50203844b7d65106 Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.325068 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m7fvd"] Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.334142 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5tzcp"] Oct 09 19:45:49 crc kubenswrapper[4907]: I1009 19:45:49.339955 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zn9nz"] Oct 09 19:45:49 crc kubenswrapper[4907]: W1009 19:45:49.348528 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ed98c56_4150_49f3_b183_440cd6fabc12.slice/crio-8f01c30d925e4ebca1010b3347e30428533f4a1ea572cbb645116debdfbaad19 WatchSource:0}: Error finding container 8f01c30d925e4ebca1010b3347e30428533f4a1ea572cbb645116debdfbaad19: Status 404 returned error can't find the container with id 8f01c30d925e4ebca1010b3347e30428533f4a1ea572cbb645116debdfbaad19 Oct 09 19:45:49 crc 
kubenswrapper[4907]: I1009 19:45:49.994770 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dz7f2-config-fr26z"] Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.015192 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dz7f2-config-fr26z"] Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.015975 4907 generic.go:334] "Generic (PLEG): container finished" podID="9ed98c56-4150-49f3-b183-440cd6fabc12" containerID="a7dd8c996028ff73a870a01b36ef0c572634580c194745dfc74f404cef0c720d" exitCode=0 Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.016030 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m7fvd" event={"ID":"9ed98c56-4150-49f3-b183-440cd6fabc12","Type":"ContainerDied","Data":"a7dd8c996028ff73a870a01b36ef0c572634580c194745dfc74f404cef0c720d"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.016086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m7fvd" event={"ID":"9ed98c56-4150-49f3-b183-440cd6fabc12","Type":"ContainerStarted","Data":"8f01c30d925e4ebca1010b3347e30428533f4a1ea572cbb645116debdfbaad19"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.020580 4907 generic.go:334] "Generic (PLEG): container finished" podID="9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3" containerID="3d8ccba5436a16d2b56612459b17edddd88d5c50b2e5ffc3573c9b65ed57bb49" exitCode=0 Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.020648 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zn9nz" event={"ID":"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3","Type":"ContainerDied","Data":"3d8ccba5436a16d2b56612459b17edddd88d5c50b2e5ffc3573c9b65ed57bb49"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.020672 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zn9nz" 
event={"ID":"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3","Type":"ContainerStarted","Data":"264e50e278b080b33212556d877a45bb1d0611791adad184a4f0413e2e3ef2cf"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.022263 4907 generic.go:334] "Generic (PLEG): container finished" podID="913f8dda-cdec-4099-9f66-8046eeab3371" containerID="3b774ec6b2281dff36595af1c8af3289f76892b0f6fccc9b4775aff5587fe039" exitCode=0 Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.022314 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7dcmv" event={"ID":"913f8dda-cdec-4099-9f66-8046eeab3371","Type":"ContainerDied","Data":"3b774ec6b2281dff36595af1c8af3289f76892b0f6fccc9b4775aff5587fe039"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.022333 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7dcmv" event={"ID":"913f8dda-cdec-4099-9f66-8046eeab3371","Type":"ContainerStarted","Data":"1d84af3330b15a075752928fb4a0de41d45eba71579b3b5f50203844b7d65106"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.023456 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5tzcp" event={"ID":"5ed1546f-98a4-4b57-b79e-8defa04c38b8","Type":"ContainerStarted","Data":"d3607d83c494c1ee3f231fb42af7345ae1fe04914c8ecb7b300a1d0623356a99"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.028524 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gc2mk" event={"ID":"c23c05c5-d7fc-4792-be12-5cfabce11bf7","Type":"ContainerDied","Data":"85089beccb6f892b52b44f1e7cc60ff231c911ae4c1a2c53d5ed095bd7cda3b5"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.028604 4907 scope.go:117] "RemoveContainer" containerID="4ee8eb790bbfb82cf9e7658d447906e04ac246d21dc646a627c9f8f1afb7807b" Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.031197 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gc2mk" Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.032941 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-64pxz" event={"ID":"4638ed26-9fde-4eca-acc3-9d292f50e4bb","Type":"ContainerStarted","Data":"ced21043db6f923ceeb9bb46fdae38b50a5d69c32520c2b93ac9f7bebfba109f"} Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.058783 4907 scope.go:117] "RemoveContainer" containerID="13c3fa1be888b2237dfe74705489ea3d9316c6bad630fc0092bd2ee2bd4c6a82" Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.076571 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-64pxz" podStartSLOduration=2.276925879 podStartE2EDuration="15.076548602s" podCreationTimestamp="2025-10-09 19:45:35 +0000 UTC" firstStartedPulling="2025-10-09 19:45:35.975005179 +0000 UTC m=+1021.506972658" lastFinishedPulling="2025-10-09 19:45:48.774627892 +0000 UTC m=+1034.306595381" observedRunningTime="2025-10-09 19:45:50.07303663 +0000 UTC m=+1035.605004119" watchObservedRunningTime="2025-10-09 19:45:50.076548602 +0000 UTC m=+1035.608516091" Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.092434 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc2mk"] Oct 09 19:45:50 crc kubenswrapper[4907]: I1009 19:45:50.099590 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gc2mk"] Oct 09 19:45:51 crc kubenswrapper[4907]: I1009 19:45:51.169437 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f500019-e0f4-408a-b24a-6d0e145e4d47" path="/var/lib/kubelet/pods/7f500019-e0f4-408a-b24a-6d0e145e4d47/volumes" Oct 09 19:45:51 crc kubenswrapper[4907]: I1009 19:45:51.170289 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" 
path="/var/lib/kubelet/pods/c23c05c5-d7fc-4792-be12-5cfabce11bf7/volumes" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.657348 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m7fvd" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.663669 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zn9nz" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.704561 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7dcmv" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.737137 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vghh2\" (UniqueName: \"kubernetes.io/projected/9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3-kube-api-access-vghh2\") pod \"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3\" (UID: \"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3\") " Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.737321 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntm92\" (UniqueName: \"kubernetes.io/projected/9ed98c56-4150-49f3-b183-440cd6fabc12-kube-api-access-ntm92\") pod \"9ed98c56-4150-49f3-b183-440cd6fabc12\" (UID: \"9ed98c56-4150-49f3-b183-440cd6fabc12\") " Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.737564 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zf74\" (UniqueName: \"kubernetes.io/projected/913f8dda-cdec-4099-9f66-8046eeab3371-kube-api-access-2zf74\") pod \"913f8dda-cdec-4099-9f66-8046eeab3371\" (UID: \"913f8dda-cdec-4099-9f66-8046eeab3371\") " Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.742823 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed98c56-4150-49f3-b183-440cd6fabc12-kube-api-access-ntm92" (OuterVolumeSpecName: 
"kube-api-access-ntm92") pod "9ed98c56-4150-49f3-b183-440cd6fabc12" (UID: "9ed98c56-4150-49f3-b183-440cd6fabc12"). InnerVolumeSpecName "kube-api-access-ntm92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.742890 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3-kube-api-access-vghh2" (OuterVolumeSpecName: "kube-api-access-vghh2") pod "9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3" (UID: "9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3"). InnerVolumeSpecName "kube-api-access-vghh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.743962 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913f8dda-cdec-4099-9f66-8046eeab3371-kube-api-access-2zf74" (OuterVolumeSpecName: "kube-api-access-2zf74") pod "913f8dda-cdec-4099-9f66-8046eeab3371" (UID: "913f8dda-cdec-4099-9f66-8046eeab3371"). InnerVolumeSpecName "kube-api-access-2zf74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.840667 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vghh2\" (UniqueName: \"kubernetes.io/projected/9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3-kube-api-access-vghh2\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.840697 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntm92\" (UniqueName: \"kubernetes.io/projected/9ed98c56-4150-49f3-b183-440cd6fabc12-kube-api-access-ntm92\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:53 crc kubenswrapper[4907]: I1009 19:45:53.840710 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zf74\" (UniqueName: \"kubernetes.io/projected/913f8dda-cdec-4099-9f66-8046eeab3371-kube-api-access-2zf74\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.071286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5tzcp" event={"ID":"5ed1546f-98a4-4b57-b79e-8defa04c38b8","Type":"ContainerStarted","Data":"3d2497550726bb028e816d349d5575927794a70e34ab605ac02a5b32dac95693"} Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.073873 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m7fvd" event={"ID":"9ed98c56-4150-49f3-b183-440cd6fabc12","Type":"ContainerDied","Data":"8f01c30d925e4ebca1010b3347e30428533f4a1ea572cbb645116debdfbaad19"} Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.073906 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f01c30d925e4ebca1010b3347e30428533f4a1ea572cbb645116debdfbaad19" Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.074074 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-m7fvd" Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.083072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zn9nz" event={"ID":"9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3","Type":"ContainerDied","Data":"264e50e278b080b33212556d877a45bb1d0611791adad184a4f0413e2e3ef2cf"} Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.083124 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zn9nz" Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.083127 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="264e50e278b080b33212556d877a45bb1d0611791adad184a4f0413e2e3ef2cf" Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.085877 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7dcmv" event={"ID":"913f8dda-cdec-4099-9f66-8046eeab3371","Type":"ContainerDied","Data":"1d84af3330b15a075752928fb4a0de41d45eba71579b3b5f50203844b7d65106"} Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.085910 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d84af3330b15a075752928fb4a0de41d45eba71579b3b5f50203844b7d65106" Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.085946 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7dcmv" Oct 09 19:45:54 crc kubenswrapper[4907]: I1009 19:45:54.087964 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5tzcp" podStartSLOduration=8.907534229 podStartE2EDuration="13.087945211s" podCreationTimestamp="2025-10-09 19:45:41 +0000 UTC" firstStartedPulling="2025-10-09 19:45:49.328965233 +0000 UTC m=+1034.860932722" lastFinishedPulling="2025-10-09 19:45:53.509376185 +0000 UTC m=+1039.041343704" observedRunningTime="2025-10-09 19:45:54.084147628 +0000 UTC m=+1039.616115107" watchObservedRunningTime="2025-10-09 19:45:54.087945211 +0000 UTC m=+1039.619912700" Oct 09 19:45:55 crc kubenswrapper[4907]: I1009 19:45:55.106605 4907 generic.go:334] "Generic (PLEG): container finished" podID="4638ed26-9fde-4eca-acc3-9d292f50e4bb" containerID="ced21043db6f923ceeb9bb46fdae38b50a5d69c32520c2b93ac9f7bebfba109f" exitCode=0 Oct 09 19:45:55 crc kubenswrapper[4907]: I1009 19:45:55.106722 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-64pxz" event={"ID":"4638ed26-9fde-4eca-acc3-9d292f50e4bb","Type":"ContainerDied","Data":"ced21043db6f923ceeb9bb46fdae38b50a5d69c32520c2b93ac9f7bebfba109f"} Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.532724 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.588677 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-combined-ca-bundle\") pod \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.589159 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-config-data\") pod \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.589269 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tpt2\" (UniqueName: \"kubernetes.io/projected/4638ed26-9fde-4eca-acc3-9d292f50e4bb-kube-api-access-6tpt2\") pod \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.589311 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-db-sync-config-data\") pod \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\" (UID: \"4638ed26-9fde-4eca-acc3-9d292f50e4bb\") " Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.594522 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4638ed26-9fde-4eca-acc3-9d292f50e4bb" (UID: "4638ed26-9fde-4eca-acc3-9d292f50e4bb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.595319 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4638ed26-9fde-4eca-acc3-9d292f50e4bb-kube-api-access-6tpt2" (OuterVolumeSpecName: "kube-api-access-6tpt2") pod "4638ed26-9fde-4eca-acc3-9d292f50e4bb" (UID: "4638ed26-9fde-4eca-acc3-9d292f50e4bb"). InnerVolumeSpecName "kube-api-access-6tpt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.615434 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4638ed26-9fde-4eca-acc3-9d292f50e4bb" (UID: "4638ed26-9fde-4eca-acc3-9d292f50e4bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.634943 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-config-data" (OuterVolumeSpecName: "config-data") pod "4638ed26-9fde-4eca-acc3-9d292f50e4bb" (UID: "4638ed26-9fde-4eca-acc3-9d292f50e4bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.691614 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tpt2\" (UniqueName: \"kubernetes.io/projected/4638ed26-9fde-4eca-acc3-9d292f50e4bb-kube-api-access-6tpt2\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.691665 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.691684 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:56 crc kubenswrapper[4907]: I1009 19:45:56.691703 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4638ed26-9fde-4eca-acc3-9d292f50e4bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.130709 4907 generic.go:334] "Generic (PLEG): container finished" podID="5ed1546f-98a4-4b57-b79e-8defa04c38b8" containerID="3d2497550726bb028e816d349d5575927794a70e34ab605ac02a5b32dac95693" exitCode=0 Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.130801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5tzcp" event={"ID":"5ed1546f-98a4-4b57-b79e-8defa04c38b8","Type":"ContainerDied","Data":"3d2497550726bb028e816d349d5575927794a70e34ab605ac02a5b32dac95693"} Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.133104 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-64pxz" 
event={"ID":"4638ed26-9fde-4eca-acc3-9d292f50e4bb","Type":"ContainerDied","Data":"0285f4a583118038c6182a480fcd4af073fe315b782bcf17bb1dfc7e80960c40"} Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.133153 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0285f4a583118038c6182a480fcd4af073fe315b782bcf17bb1dfc7e80960c40" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.135005 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-64pxz" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.621761 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-rbzl5"] Oct 09 19:45:57 crc kubenswrapper[4907]: E1009 19:45:57.622072 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f500019-e0f4-408a-b24a-6d0e145e4d47" containerName="ovn-config" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622083 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f500019-e0f4-408a-b24a-6d0e145e4d47" containerName="ovn-config" Oct 09 19:45:57 crc kubenswrapper[4907]: E1009 19:45:57.622095 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622101 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: E1009 19:45:57.622120 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerName="init" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622126 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerName="init" Oct 09 19:45:57 crc kubenswrapper[4907]: E1009 19:45:57.622138 4907 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="913f8dda-cdec-4099-9f66-8046eeab3371" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622143 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="913f8dda-cdec-4099-9f66-8046eeab3371" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: E1009 19:45:57.622154 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed98c56-4150-49f3-b183-440cd6fabc12" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622160 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed98c56-4150-49f3-b183-440cd6fabc12" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: E1009 19:45:57.622168 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4638ed26-9fde-4eca-acc3-9d292f50e4bb" containerName="glance-db-sync" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622175 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638ed26-9fde-4eca-acc3-9d292f50e4bb" containerName="glance-db-sync" Oct 09 19:45:57 crc kubenswrapper[4907]: E1009 19:45:57.622185 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerName="dnsmasq-dns" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622191 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerName="dnsmasq-dns" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622334 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="913f8dda-cdec-4099-9f66-8046eeab3371" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622344 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed98c56-4150-49f3-b183-440cd6fabc12" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622357 4907 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7f500019-e0f4-408a-b24a-6d0e145e4d47" containerName="ovn-config" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622370 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23c05c5-d7fc-4792-be12-5cfabce11bf7" containerName="dnsmasq-dns" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622379 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3" containerName="mariadb-database-create" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.622393 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4638ed26-9fde-4eca-acc3-9d292f50e4bb" containerName="glance-db-sync" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.623382 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.643812 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-rbzl5"] Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.709031 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.709094 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-config\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.709132 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.709165 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wnz5\" (UniqueName: \"kubernetes.io/projected/e7bef797-5e36-4bfd-9555-8fc0887067fa-kube-api-access-9wnz5\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.709193 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.709224 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.810708 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.811116 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-config\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.811181 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.811230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wnz5\" (UniqueName: \"kubernetes.io/projected/e7bef797-5e36-4bfd-9555-8fc0887067fa-kube-api-access-9wnz5\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.811273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.811313 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.811629 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.812008 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-config\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.812207 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.812266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.812418 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.844516 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wnz5\" (UniqueName: 
\"kubernetes.io/projected/e7bef797-5e36-4bfd-9555-8fc0887067fa-kube-api-access-9wnz5\") pod \"dnsmasq-dns-7ff5475cc9-rbzl5\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:57 crc kubenswrapper[4907]: I1009 19:45:57.941136 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.431148 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-rbzl5"] Oct 09 19:45:58 crc kubenswrapper[4907]: W1009 19:45:58.434883 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7bef797_5e36_4bfd_9555_8fc0887067fa.slice/crio-2aa477a3dc7dae8fd32fb0e47a2eae01de3160c03bf647016bf3a2abadd6b6e3 WatchSource:0}: Error finding container 2aa477a3dc7dae8fd32fb0e47a2eae01de3160c03bf647016bf3a2abadd6b6e3: Status 404 returned error can't find the container with id 2aa477a3dc7dae8fd32fb0e47a2eae01de3160c03bf647016bf3a2abadd6b6e3 Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.455356 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.621773 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-combined-ca-bundle\") pod \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.622124 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h88sl\" (UniqueName: \"kubernetes.io/projected/5ed1546f-98a4-4b57-b79e-8defa04c38b8-kube-api-access-h88sl\") pod \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.622179 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-config-data\") pod \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\" (UID: \"5ed1546f-98a4-4b57-b79e-8defa04c38b8\") " Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.628121 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed1546f-98a4-4b57-b79e-8defa04c38b8-kube-api-access-h88sl" (OuterVolumeSpecName: "kube-api-access-h88sl") pod "5ed1546f-98a4-4b57-b79e-8defa04c38b8" (UID: "5ed1546f-98a4-4b57-b79e-8defa04c38b8"). InnerVolumeSpecName "kube-api-access-h88sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.645252 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ed1546f-98a4-4b57-b79e-8defa04c38b8" (UID: "5ed1546f-98a4-4b57-b79e-8defa04c38b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.672211 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-config-data" (OuterVolumeSpecName: "config-data") pod "5ed1546f-98a4-4b57-b79e-8defa04c38b8" (UID: "5ed1546f-98a4-4b57-b79e-8defa04c38b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.723277 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.723313 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h88sl\" (UniqueName: \"kubernetes.io/projected/5ed1546f-98a4-4b57-b79e-8defa04c38b8-kube-api-access-h88sl\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:58 crc kubenswrapper[4907]: I1009 19:45:58.723326 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ed1546f-98a4-4b57-b79e-8defa04c38b8-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.160013 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5tzcp" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.160272 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5tzcp" event={"ID":"5ed1546f-98a4-4b57-b79e-8defa04c38b8","Type":"ContainerDied","Data":"d3607d83c494c1ee3f231fb42af7345ae1fe04914c8ecb7b300a1d0623356a99"} Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.160301 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3607d83c494c1ee3f231fb42af7345ae1fe04914c8ecb7b300a1d0623356a99" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.161067 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" event={"ID":"e7bef797-5e36-4bfd-9555-8fc0887067fa","Type":"ContainerStarted","Data":"2aa477a3dc7dae8fd32fb0e47a2eae01de3160c03bf647016bf3a2abadd6b6e3"} Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.393766 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-rbzl5"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.415863 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-q5sjd"] Oct 09 19:45:59 crc kubenswrapper[4907]: E1009 19:45:59.416284 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed1546f-98a4-4b57-b79e-8defa04c38b8" containerName="keystone-db-sync" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.416307 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed1546f-98a4-4b57-b79e-8defa04c38b8" containerName="keystone-db-sync" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.416535 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed1546f-98a4-4b57-b79e-8defa04c38b8" containerName="keystone-db-sync" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.417214 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.426141 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.426176 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.426484 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9tb58" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.426658 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.428285 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5sjd"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.504077 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.512993 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.518925 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.560858 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-fernet-keys\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.560909 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-credential-keys\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.560939 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-config-data\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.560960 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kff49\" (UniqueName: \"kubernetes.io/projected/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-kube-api-access-kff49\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.560979 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l4dtc\" (UniqueName: \"kubernetes.io/projected/b2338e4d-2560-4e36-a309-4d46241e3c1b-kube-api-access-l4dtc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.561006 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.561023 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.561041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.561094 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.561113 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.561138 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.561155 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-scripts\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.625400 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.627367 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.629802 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.630111 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.660563 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662306 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kff49\" (UniqueName: \"kubernetes.io/projected/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-kube-api-access-kff49\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662427 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4dtc\" (UniqueName: \"kubernetes.io/projected/b2338e4d-2560-4e36-a309-4d46241e3c1b-kube-api-access-l4dtc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662527 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662600 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" 
(UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662664 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-log-httpd\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662872 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.662945 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663010 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " 
pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663079 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv2dm\" (UniqueName: \"kubernetes.io/projected/3cf74d09-587e-410e-b450-e4d5206d4f55-kube-api-access-vv2dm\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663147 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663212 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-scripts\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663291 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-fernet-keys\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663376 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-config-data\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663436 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663526 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-credential-keys\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663603 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-scripts\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663670 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-run-httpd\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663743 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-config-data\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.664325 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.664913 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.665521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.666038 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.668878 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-scripts\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.663622 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: 
\"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.668881 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-credential-keys\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.669390 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-config-data\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.677716 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.678317 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-fernet-keys\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.695431 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4dtc\" (UniqueName: \"kubernetes.io/projected/b2338e4d-2560-4e36-a309-4d46241e3c1b-kube-api-access-l4dtc\") pod \"dnsmasq-dns-5c5cc7c5ff-4fp5m\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc 
kubenswrapper[4907]: I1009 19:45:59.698596 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kff49\" (UniqueName: \"kubernetes.io/projected/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-kube-api-access-kff49\") pod \"keystone-bootstrap-q5sjd\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.710259 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-5j4pr"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.711388 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.722061 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.722425 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k2h6f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.722597 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.723816 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.724571 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.733452 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5j4pr"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.757638 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.766576 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dw45f"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.768320 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.769588 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-config-data\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.769622 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.769662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-scripts\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.769684 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-run-httpd\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.769768 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-log-httpd\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.769791 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.769831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv2dm\" (UniqueName: \"kubernetes.io/projected/3cf74d09-587e-410e-b450-e4d5206d4f55-kube-api-access-vv2dm\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.770637 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-run-httpd\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.772608 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-log-httpd\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.777530 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 
19:45:59.781423 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-config-data\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.785706 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.786884 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dw45f"] Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.791316 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-scripts\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.794649 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv2dm\" (UniqueName: \"kubernetes.io/projected/3cf74d09-587e-410e-b450-e4d5206d4f55-kube-api-access-vv2dm\") pod \"ceilometer-0\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.875290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877301 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-combined-ca-bundle\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877413 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5skn\" (UniqueName: \"kubernetes.io/projected/39b99c09-f14d-462f-a4fa-e2555b429611-kube-api-access-m5skn\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877442 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-config\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877557 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-config-data\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877740 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-scripts\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877809 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877843 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877904 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.877938 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w7t9\" (UniqueName: \"kubernetes.io/projected/1fa02f8d-3656-4f83-8e33-01d053471999-kube-api-access-9w7t9\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.878014 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b99c09-f14d-462f-a4fa-e2555b429611-logs\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 
19:45:59.948810 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994049 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-combined-ca-bundle\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994144 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5skn\" (UniqueName: \"kubernetes.io/projected/39b99c09-f14d-462f-a4fa-e2555b429611-kube-api-access-m5skn\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994169 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-config\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994191 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-config-data\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " 
pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994245 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-scripts\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994323 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994362 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.994395 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w7t9\" (UniqueName: \"kubernetes.io/projected/1fa02f8d-3656-4f83-8e33-01d053471999-kube-api-access-9w7t9\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc 
kubenswrapper[4907]: I1009 19:45:59.994447 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b99c09-f14d-462f-a4fa-e2555b429611-logs\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.995087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b99c09-f14d-462f-a4fa-e2555b429611-logs\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.996649 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.997234 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.997568 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.998378 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.998689 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-config\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:45:59 crc kubenswrapper[4907]: I1009 19:45:59.999130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-config-data\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.001990 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-combined-ca-bundle\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.002196 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-scripts\") pod \"placement-db-sync-5j4pr\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.022806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5skn\" (UniqueName: \"kubernetes.io/projected/39b99c09-f14d-462f-a4fa-e2555b429611-kube-api-access-m5skn\") pod \"placement-db-sync-5j4pr\" (UID: 
\"39b99c09-f14d-462f-a4fa-e2555b429611\") " pod="openstack/placement-db-sync-5j4pr" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.030272 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w7t9\" (UniqueName: \"kubernetes.io/projected/1fa02f8d-3656-4f83-8e33-01d053471999-kube-api-access-9w7t9\") pod \"dnsmasq-dns-8b5c85b87-dw45f\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.182749 4907 generic.go:334] "Generic (PLEG): container finished" podID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerID="663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2" exitCode=0 Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.182845 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" event={"ID":"e7bef797-5e36-4bfd-9555-8fc0887067fa","Type":"ContainerDied","Data":"663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2"} Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.204935 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5j4pr" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.212319 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.364545 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-q5sjd"] Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.370433 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m"] Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.522691 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.524201 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.529879 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.530004 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sjjmv" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.531240 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.533991 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:46:00 crc kubenswrapper[4907]: W1009 19:46:00.542383 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf74d09_587e_410e_b450_e4d5206d4f55.slice/crio-3aa8053d4c1c794976840941d3f8dd1df01300f4f30d921aa3b3aeb4ba0ab84e WatchSource:0}: Error finding container 3aa8053d4c1c794976840941d3f8dd1df01300f4f30d921aa3b3aeb4ba0ab84e: Status 404 returned error can't find the container with id 3aa8053d4c1c794976840941d3f8dd1df01300f4f30d921aa3b3aeb4ba0ab84e Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 
19:46:00.564417 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.584567 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.586258 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.587790 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.590022 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.612530 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.612890 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ljqv\" (UniqueName: \"kubernetes.io/projected/4008b367-6ed3-4514-9658-9f6916dc0cd7-kube-api-access-9ljqv\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.612944 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " 
pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.612980 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-logs\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613079 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613119 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613166 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613230 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-config-data\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613270 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613293 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-scripts\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613322 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwn9c\" (UniqueName: \"kubernetes.io/projected/acc60f91-13bd-4e6a-9f47-4feced60618a-kube-api-access-hwn9c\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613373 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.613408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715773 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-logs\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715840 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715859 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc 
kubenswrapper[4907]: I1009 19:46:00.715892 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715913 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715947 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-config-data\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715968 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.715985 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-scripts\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.716003 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hwn9c\" (UniqueName: \"kubernetes.io/projected/acc60f91-13bd-4e6a-9f47-4feced60618a-kube-api-access-hwn9c\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.716035 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.716058 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.716084 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.716109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ljqv\" (UniqueName: \"kubernetes.io/projected/4008b367-6ed3-4514-9658-9f6916dc0cd7-kube-api-access-9ljqv\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.716366 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.716718 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.717397 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.717726 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-logs\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.720037 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.720554 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.723715 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-scripts\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.724055 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.724611 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.726027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.726242 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-config-data\") pod \"glance-default-internal-api-0\" 
(UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.726432 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-config-data\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.740481 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwn9c\" (UniqueName: \"kubernetes.io/projected/acc60f91-13bd-4e6a-9f47-4feced60618a-kube-api-access-hwn9c\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.740680 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ljqv\" (UniqueName: \"kubernetes.io/projected/4008b367-6ed3-4514-9658-9f6916dc0cd7-kube-api-access-9ljqv\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.776536 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.782429 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:00 
crc kubenswrapper[4907]: I1009 19:46:00.787690 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5j4pr"] Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.847637 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.854109 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dw45f"] Oct 09 19:46:00 crc kubenswrapper[4907]: W1009 19:46:00.864218 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa02f8d_3656_4f83_8e33_01d053471999.slice/crio-4203a06969a31053ae160e096982493c7f647c71bddb9a683ca84b923dc27c33 WatchSource:0}: Error finding container 4203a06969a31053ae160e096982493c7f647c71bddb9a683ca84b923dc27c33: Status 404 returned error can't find the container with id 4203a06969a31053ae160e096982493c7f647c71bddb9a683ca84b923dc27c33 Oct 09 19:46:00 crc kubenswrapper[4907]: I1009 19:46:00.911962 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.202647 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5sjd" event={"ID":"ad0a99ee-a820-4314-b0fa-6c42953f8c9a","Type":"ContainerStarted","Data":"0b94a508daf88e3c8c116f75b2d90521643ab1c9e427223336f6964034206582"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.203219 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5sjd" event={"ID":"ad0a99ee-a820-4314-b0fa-6c42953f8c9a","Type":"ContainerStarted","Data":"6fd1c137843538be5bf2ea66eff045cbb4826016b8635d6ce8099f6d91cf97f9"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.206550 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" event={"ID":"1fa02f8d-3656-4f83-8e33-01d053471999","Type":"ContainerStarted","Data":"4203a06969a31053ae160e096982493c7f647c71bddb9a683ca84b923dc27c33"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.208426 4907 generic.go:334] "Generic (PLEG): container finished" podID="b2338e4d-2560-4e36-a309-4d46241e3c1b" containerID="3b2b0aac95dd9c5b03dfad1f638d0f3acc06557bb6d19cb04dd838dc4c951ed8" exitCode=0 Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.208497 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" event={"ID":"b2338e4d-2560-4e36-a309-4d46241e3c1b","Type":"ContainerDied","Data":"3b2b0aac95dd9c5b03dfad1f638d0f3acc06557bb6d19cb04dd838dc4c951ed8"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.208516 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" event={"ID":"b2338e4d-2560-4e36-a309-4d46241e3c1b","Type":"ContainerStarted","Data":"b40d288c40100cb7229c36eec7efe87632f63fdf3b848c2378dd3c7e55886e57"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.209299 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerStarted","Data":"3aa8053d4c1c794976840941d3f8dd1df01300f4f30d921aa3b3aeb4ba0ab84e"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.210114 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j4pr" event={"ID":"39b99c09-f14d-462f-a4fa-e2555b429611","Type":"ContainerStarted","Data":"6dfac73e0292bfb61a516d4bc4d773c29cf1c35d47f3d89a3fb7a9bac9c4c8ca"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.221992 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" event={"ID":"e7bef797-5e36-4bfd-9555-8fc0887067fa","Type":"ContainerStarted","Data":"0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6"} Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.222152 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" podUID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerName="dnsmasq-dns" containerID="cri-o://0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6" gracePeriod=10 Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.222375 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.227523 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-q5sjd" podStartSLOduration=2.227504342 podStartE2EDuration="2.227504342s" podCreationTimestamp="2025-10-09 19:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:01.21806115 +0000 UTC m=+1046.750028649" watchObservedRunningTime="2025-10-09 19:46:01.227504342 +0000 UTC m=+1046.759471831" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.271505 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" podStartSLOduration=4.271483583 podStartE2EDuration="4.271483583s" podCreationTimestamp="2025-10-09 19:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:01.26037342 +0000 UTC m=+1046.792340929" watchObservedRunningTime="2025-10-09 19:46:01.271483583 +0000 UTC m=+1046.803451072" Oct 09 19:46:01 crc kubenswrapper[4907]: E1009 19:46:01.322524 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7bef797_5e36_4bfd_9555_8fc0887067fa.slice/crio-conmon-0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6.scope\": RecentStats: unable to find data in memory cache]" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.330104 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e8ad-account-create-65dcs"] Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.331897 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e8ad-account-create-65dcs" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.335335 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.345371 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e8ad-account-create-65dcs"] Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.430657 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64grm\" (UniqueName: \"kubernetes.io/projected/c1ff2dda-76aa-4595-bd0c-e69456106650-kube-api-access-64grm\") pod \"cinder-e8ad-account-create-65dcs\" (UID: \"c1ff2dda-76aa-4595-bd0c-e69456106650\") " pod="openstack/cinder-e8ad-account-create-65dcs" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.436543 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:01 crc kubenswrapper[4907]: W1009 19:46:01.444110 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacc60f91_13bd_4e6a_9f47_4feced60618a.slice/crio-221c5b32aaffe1cdf3ed89fdc7c3ef202a688225842728d9cc90e5e44da0e8c8 WatchSource:0}: Error finding container 221c5b32aaffe1cdf3ed89fdc7c3ef202a688225842728d9cc90e5e44da0e8c8: Status 404 returned error can't find the container with id 221c5b32aaffe1cdf3ed89fdc7c3ef202a688225842728d9cc90e5e44da0e8c8 Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.514036 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-aecc-account-create-9lqqk"] Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.515030 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aecc-account-create-9lqqk" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.520019 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.522241 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aecc-account-create-9lqqk"] Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.531654 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4cgb\" (UniqueName: \"kubernetes.io/projected/f8a28eaa-918c-4756-9936-1d724410617c-kube-api-access-h4cgb\") pod \"barbican-aecc-account-create-9lqqk\" (UID: \"f8a28eaa-918c-4756-9936-1d724410617c\") " pod="openstack/barbican-aecc-account-create-9lqqk" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.531848 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64grm\" (UniqueName: \"kubernetes.io/projected/c1ff2dda-76aa-4595-bd0c-e69456106650-kube-api-access-64grm\") pod \"cinder-e8ad-account-create-65dcs\" (UID: \"c1ff2dda-76aa-4595-bd0c-e69456106650\") " pod="openstack/cinder-e8ad-account-create-65dcs" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.560204 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64grm\" (UniqueName: \"kubernetes.io/projected/c1ff2dda-76aa-4595-bd0c-e69456106650-kube-api-access-64grm\") pod \"cinder-e8ad-account-create-65dcs\" (UID: \"c1ff2dda-76aa-4595-bd0c-e69456106650\") " pod="openstack/cinder-e8ad-account-create-65dcs" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.618586 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2d90-account-create-cwgs4"] Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.620058 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2d90-account-create-cwgs4" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.623562 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.633143 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4cgb\" (UniqueName: \"kubernetes.io/projected/f8a28eaa-918c-4756-9936-1d724410617c-kube-api-access-h4cgb\") pod \"barbican-aecc-account-create-9lqqk\" (UID: \"f8a28eaa-918c-4756-9936-1d724410617c\") " pod="openstack/barbican-aecc-account-create-9lqqk" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.642459 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2d90-account-create-cwgs4"] Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.658978 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4cgb\" (UniqueName: \"kubernetes.io/projected/f8a28eaa-918c-4756-9936-1d724410617c-kube-api-access-h4cgb\") pod \"barbican-aecc-account-create-9lqqk\" (UID: \"f8a28eaa-918c-4756-9936-1d724410617c\") " pod="openstack/barbican-aecc-account-create-9lqqk" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.674614 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.734789 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2gqs\" (UniqueName: \"kubernetes.io/projected/562823d8-4ea2-43cd-8e94-5c1c154de988-kube-api-access-n2gqs\") pod \"neutron-2d90-account-create-cwgs4\" (UID: \"562823d8-4ea2-43cd-8e94-5c1c154de988\") " pod="openstack/neutron-2d90-account-create-cwgs4" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.812524 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e8ad-account-create-65dcs" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.836204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2gqs\" (UniqueName: \"kubernetes.io/projected/562823d8-4ea2-43cd-8e94-5c1c154de988-kube-api-access-n2gqs\") pod \"neutron-2d90-account-create-cwgs4\" (UID: \"562823d8-4ea2-43cd-8e94-5c1c154de988\") " pod="openstack/neutron-2d90-account-create-cwgs4" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.839541 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aecc-account-create-9lqqk" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.854230 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.864764 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2gqs\" (UniqueName: \"kubernetes.io/projected/562823d8-4ea2-43cd-8e94-5c1c154de988-kube-api-access-n2gqs\") pod \"neutron-2d90-account-create-cwgs4\" (UID: \"562823d8-4ea2-43cd-8e94-5c1c154de988\") " pod="openstack/neutron-2d90-account-create-cwgs4" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.866320 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:46:01 crc kubenswrapper[4907]: I1009 19:46:01.975167 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2d90-account-create-cwgs4" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040108 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-nb\") pod \"e7bef797-5e36-4bfd-9555-8fc0887067fa\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040153 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4dtc\" (UniqueName: \"kubernetes.io/projected/b2338e4d-2560-4e36-a309-4d46241e3c1b-kube-api-access-l4dtc\") pod \"b2338e4d-2560-4e36-a309-4d46241e3c1b\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040177 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-swift-storage-0\") pod \"b2338e4d-2560-4e36-a309-4d46241e3c1b\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040204 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wnz5\" (UniqueName: \"kubernetes.io/projected/e7bef797-5e36-4bfd-9555-8fc0887067fa-kube-api-access-9wnz5\") pod \"e7bef797-5e36-4bfd-9555-8fc0887067fa\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040328 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-sb\") pod \"e7bef797-5e36-4bfd-9555-8fc0887067fa\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040357 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-svc\") pod \"e7bef797-5e36-4bfd-9555-8fc0887067fa\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040382 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-config\") pod \"e7bef797-5e36-4bfd-9555-8fc0887067fa\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040400 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-nb\") pod \"b2338e4d-2560-4e36-a309-4d46241e3c1b\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-sb\") pod \"b2338e4d-2560-4e36-a309-4d46241e3c1b\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040480 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-swift-storage-0\") pod \"e7bef797-5e36-4bfd-9555-8fc0887067fa\" (UID: \"e7bef797-5e36-4bfd-9555-8fc0887067fa\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040891 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-config\") pod \"b2338e4d-2560-4e36-a309-4d46241e3c1b\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " Oct 
09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.040923 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-svc\") pod \"b2338e4d-2560-4e36-a309-4d46241e3c1b\" (UID: \"b2338e4d-2560-4e36-a309-4d46241e3c1b\") " Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.096796 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2338e4d-2560-4e36-a309-4d46241e3c1b-kube-api-access-l4dtc" (OuterVolumeSpecName: "kube-api-access-l4dtc") pod "b2338e4d-2560-4e36-a309-4d46241e3c1b" (UID: "b2338e4d-2560-4e36-a309-4d46241e3c1b"). InnerVolumeSpecName "kube-api-access-l4dtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.117692 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bef797-5e36-4bfd-9555-8fc0887067fa-kube-api-access-9wnz5" (OuterVolumeSpecName: "kube-api-access-9wnz5") pod "e7bef797-5e36-4bfd-9555-8fc0887067fa" (UID: "e7bef797-5e36-4bfd-9555-8fc0887067fa"). InnerVolumeSpecName "kube-api-access-9wnz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.144140 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4dtc\" (UniqueName: \"kubernetes.io/projected/b2338e4d-2560-4e36-a309-4d46241e3c1b-kube-api-access-l4dtc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.144174 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wnz5\" (UniqueName: \"kubernetes.io/projected/e7bef797-5e36-4bfd-9555-8fc0887067fa-kube-api-access-9wnz5\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.177739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b2338e4d-2560-4e36-a309-4d46241e3c1b" (UID: "b2338e4d-2560-4e36-a309-4d46241e3c1b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.201275 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-config" (OuterVolumeSpecName: "config") pod "b2338e4d-2560-4e36-a309-4d46241e3c1b" (UID: "b2338e4d-2560-4e36-a309-4d46241e3c1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.203581 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b2338e4d-2560-4e36-a309-4d46241e3c1b" (UID: "b2338e4d-2560-4e36-a309-4d46241e3c1b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.221577 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b2338e4d-2560-4e36-a309-4d46241e3c1b" (UID: "b2338e4d-2560-4e36-a309-4d46241e3c1b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.253526 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.253559 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.253572 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.253585 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.257557 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b2338e4d-2560-4e36-a309-4d46241e3c1b" (UID: "b2338e4d-2560-4e36-a309-4d46241e3c1b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.267371 4907 generic.go:334] "Generic (PLEG): container finished" podID="1fa02f8d-3656-4f83-8e33-01d053471999" containerID="2cf0bd7a46306bfaba9d50f007ae22c87fc1243f5071d763cbe337161004b3e6" exitCode=0 Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.267602 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" event={"ID":"1fa02f8d-3656-4f83-8e33-01d053471999","Type":"ContainerDied","Data":"2cf0bd7a46306bfaba9d50f007ae22c87fc1243f5071d763cbe337161004b3e6"} Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.274519 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.275127 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m" event={"ID":"b2338e4d-2560-4e36-a309-4d46241e3c1b","Type":"ContainerDied","Data":"b40d288c40100cb7229c36eec7efe87632f63fdf3b848c2378dd3c7e55886e57"} Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.275168 4907 scope.go:117] "RemoveContainer" containerID="3b2b0aac95dd9c5b03dfad1f638d0f3acc06557bb6d19cb04dd838dc4c951ed8" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.318817 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7bef797-5e36-4bfd-9555-8fc0887067fa" (UID: "e7bef797-5e36-4bfd-9555-8fc0887067fa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.323899 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7bef797-5e36-4bfd-9555-8fc0887067fa" (UID: "e7bef797-5e36-4bfd-9555-8fc0887067fa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.325393 4907 generic.go:334] "Generic (PLEG): container finished" podID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerID="0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6" exitCode=0 Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.325522 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" event={"ID":"e7bef797-5e36-4bfd-9555-8fc0887067fa","Type":"ContainerDied","Data":"0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6"} Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.325538 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.325555 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-rbzl5" event={"ID":"e7bef797-5e36-4bfd-9555-8fc0887067fa","Type":"ContainerDied","Data":"2aa477a3dc7dae8fd32fb0e47a2eae01de3160c03bf647016bf3a2abadd6b6e3"} Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.328261 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"acc60f91-13bd-4e6a-9f47-4feced60618a","Type":"ContainerStarted","Data":"221c5b32aaffe1cdf3ed89fdc7c3ef202a688225842728d9cc90e5e44da0e8c8"} Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.334908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4008b367-6ed3-4514-9658-9f6916dc0cd7","Type":"ContainerStarted","Data":"9be43d9c27c7cc7aaab7baeb7e353c731d96c36628a8c4bc5d3426cc23ab7de5"} Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.357831 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2338e4d-2560-4e36-a309-4d46241e3c1b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.357873 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.357887 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.374712 4907 scope.go:117] "RemoveContainer" containerID="0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6" Oct 09 
19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.398803 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-config" (OuterVolumeSpecName: "config") pod "e7bef797-5e36-4bfd-9555-8fc0887067fa" (UID: "e7bef797-5e36-4bfd-9555-8fc0887067fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.399170 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7bef797-5e36-4bfd-9555-8fc0887067fa" (UID: "e7bef797-5e36-4bfd-9555-8fc0887067fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.399398 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e7bef797-5e36-4bfd-9555-8fc0887067fa" (UID: "e7bef797-5e36-4bfd-9555-8fc0887067fa"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.401402 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m"] Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.417781 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-4fp5m"] Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.420672 4907 scope.go:117] "RemoveContainer" containerID="663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.460669 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.460699 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.460712 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7bef797-5e36-4bfd-9555-8fc0887067fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.481399 4907 scope.go:117] "RemoveContainer" containerID="0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6" Oct 09 19:46:02 crc kubenswrapper[4907]: E1009 19:46:02.482160 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6\": container with ID starting with 0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6 not found: ID does not exist" 
containerID="0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.482222 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6"} err="failed to get container status \"0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6\": rpc error: code = NotFound desc = could not find container \"0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6\": container with ID starting with 0f7c9e60f197e33b61f4f082aed8694bcf671a3ff432827c33aa6be109696bf6 not found: ID does not exist" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.482251 4907 scope.go:117] "RemoveContainer" containerID="663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2" Oct 09 19:46:02 crc kubenswrapper[4907]: E1009 19:46:02.482624 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2\": container with ID starting with 663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2 not found: ID does not exist" containerID="663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.482657 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2"} err="failed to get container status \"663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2\": rpc error: code = NotFound desc = could not find container \"663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2\": container with ID starting with 663278e4060fdc0182a890352a82aaf97389f27352ce7a6d0f2768225df47ac2 not found: ID does not exist" Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.535436 4907 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-aecc-account-create-9lqqk"] Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.545242 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e8ad-account-create-65dcs"] Oct 09 19:46:02 crc kubenswrapper[4907]: W1009 19:46:02.547234 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1ff2dda_76aa_4595_bd0c_e69456106650.slice/crio-e73d4478b859182067464aee8341dff6c247b5082730803aa20954528ab04d2e WatchSource:0}: Error finding container e73d4478b859182067464aee8341dff6c247b5082730803aa20954528ab04d2e: Status 404 returned error can't find the container with id e73d4478b859182067464aee8341dff6c247b5082730803aa20954528ab04d2e Oct 09 19:46:02 crc kubenswrapper[4907]: W1009 19:46:02.550502 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a28eaa_918c_4756_9936_1d724410617c.slice/crio-62c7ccc55db77217ff124348f67e51bde083c8cd25cdede57676fa72a1ccd1d9 WatchSource:0}: Error finding container 62c7ccc55db77217ff124348f67e51bde083c8cd25cdede57676fa72a1ccd1d9: Status 404 returned error can't find the container with id 62c7ccc55db77217ff124348f67e51bde083c8cd25cdede57676fa72a1ccd1d9 Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.669272 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-rbzl5"] Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.676925 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-rbzl5"] Oct 09 19:46:02 crc kubenswrapper[4907]: I1009 19:46:02.742912 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2d90-account-create-cwgs4"] Oct 09 19:46:02 crc kubenswrapper[4907]: W1009 19:46:02.764914 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod562823d8_4ea2_43cd_8e94_5c1c154de988.slice/crio-daab7ff9dcac4497ea10500b8094112e85283e025e7a734cfda4df3b3ebc2f86 WatchSource:0}: Error finding container daab7ff9dcac4497ea10500b8094112e85283e025e7a734cfda4df3b3ebc2f86: Status 404 returned error can't find the container with id daab7ff9dcac4497ea10500b8094112e85283e025e7a734cfda4df3b3ebc2f86 Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.164842 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2338e4d-2560-4e36-a309-4d46241e3c1b" path="/var/lib/kubelet/pods/b2338e4d-2560-4e36-a309-4d46241e3c1b/volumes" Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.166459 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bef797-5e36-4bfd-9555-8fc0887067fa" path="/var/lib/kubelet/pods/e7bef797-5e36-4bfd-9555-8fc0887067fa/volumes" Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.346737 4907 generic.go:334] "Generic (PLEG): container finished" podID="c1ff2dda-76aa-4595-bd0c-e69456106650" containerID="16914d10778ef5739d4802a16fedae69c8c022532bc0f32a8b69e2fc376e7d85" exitCode=0 Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.347405 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e8ad-account-create-65dcs" event={"ID":"c1ff2dda-76aa-4595-bd0c-e69456106650","Type":"ContainerDied","Data":"16914d10778ef5739d4802a16fedae69c8c022532bc0f32a8b69e2fc376e7d85"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.347441 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e8ad-account-create-65dcs" event={"ID":"c1ff2dda-76aa-4595-bd0c-e69456106650","Type":"ContainerStarted","Data":"e73d4478b859182067464aee8341dff6c247b5082730803aa20954528ab04d2e"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.352920 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" 
event={"ID":"1fa02f8d-3656-4f83-8e33-01d053471999","Type":"ContainerStarted","Data":"be07f0262303804bfbc1cb8ea506aee01e615e12e1b136f278186de95fee87fa"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.353801 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.366213 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"acc60f91-13bd-4e6a-9f47-4feced60618a","Type":"ContainerStarted","Data":"46c8f79491335d3b20733f1d0337dab83da46e25468c314d7518c2723fa5530c"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.370911 4907 generic.go:334] "Generic (PLEG): container finished" podID="f8a28eaa-918c-4756-9936-1d724410617c" containerID="760887aa94cf5066bb78783603b9abf2f7ed6cbd438ed798ca3984f1d9180a14" exitCode=0 Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.370967 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aecc-account-create-9lqqk" event={"ID":"f8a28eaa-918c-4756-9936-1d724410617c","Type":"ContainerDied","Data":"760887aa94cf5066bb78783603b9abf2f7ed6cbd438ed798ca3984f1d9180a14"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.370991 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aecc-account-create-9lqqk" event={"ID":"f8a28eaa-918c-4756-9936-1d724410617c","Type":"ContainerStarted","Data":"62c7ccc55db77217ff124348f67e51bde083c8cd25cdede57676fa72a1ccd1d9"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.374176 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68c6474976-4g4dt_c153b2ce-efcd-4d73-8b4f-8e67322e88c5/catalog-operator/0.log" Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.374232 4907 generic.go:334] "Generic (PLEG): container finished" podID="c153b2ce-efcd-4d73-8b4f-8e67322e88c5" 
containerID="ca7879c2b6a286bb69f313f0476f97ec62059e250cc32609ce16a08537d96fbf" exitCode=2 Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.374323 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" event={"ID":"c153b2ce-efcd-4d73-8b4f-8e67322e88c5","Type":"ContainerDied","Data":"ca7879c2b6a286bb69f313f0476f97ec62059e250cc32609ce16a08537d96fbf"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.375084 4907 scope.go:117] "RemoveContainer" containerID="ca7879c2b6a286bb69f313f0476f97ec62059e250cc32609ce16a08537d96fbf" Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.376729 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2d90-account-create-cwgs4" event={"ID":"562823d8-4ea2-43cd-8e94-5c1c154de988","Type":"ContainerStarted","Data":"daab7ff9dcac4497ea10500b8094112e85283e025e7a734cfda4df3b3ebc2f86"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.382562 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" podStartSLOduration=4.382540735 podStartE2EDuration="4.382540735s" podCreationTimestamp="2025-10-09 19:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:03.381858379 +0000 UTC m=+1048.913825888" watchObservedRunningTime="2025-10-09 19:46:03.382540735 +0000 UTC m=+1048.914508224" Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.388318 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4008b367-6ed3-4514-9658-9f6916dc0cd7","Type":"ContainerStarted","Data":"fd382398ab923035d451db18993d649286536c82b3e1f828175bd0ff3d13831e"} Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.532332 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:03 
crc kubenswrapper[4907]: I1009 19:46:03.588843 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:03 crc kubenswrapper[4907]: I1009 19:46:03.610452 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.401002 4907 generic.go:334] "Generic (PLEG): container finished" podID="562823d8-4ea2-43cd-8e94-5c1c154de988" containerID="78b7cecd01ea23f7eb702bcc8518136d4a6050ced256d01f68b1eb33bd68e93c" exitCode=0 Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.401059 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2d90-account-create-cwgs4" event={"ID":"562823d8-4ea2-43cd-8e94-5c1c154de988","Type":"ContainerDied","Data":"78b7cecd01ea23f7eb702bcc8518136d4a6050ced256d01f68b1eb33bd68e93c"} Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.410798 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"acc60f91-13bd-4e6a-9f47-4feced60618a","Type":"ContainerStarted","Data":"bd0b5bd89c2825ba38fea7b50ba7dea3faa04d1aca19b1f3dd361e34161287c3"} Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.420221 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4008b367-6ed3-4514-9658-9f6916dc0cd7","Type":"ContainerStarted","Data":"1694b4ceea53e0ed1479459e84ce373780d035eb33f0912695d0126e87614ffb"} Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.420338 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-log" containerID="cri-o://fd382398ab923035d451db18993d649286536c82b3e1f828175bd0ff3d13831e" gracePeriod=30 Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.420439 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-httpd" containerID="cri-o://1694b4ceea53e0ed1479459e84ce373780d035eb33f0912695d0126e87614ffb" gracePeriod=30 Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.428547 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68c6474976-4g4dt_c153b2ce-efcd-4d73-8b4f-8e67322e88c5/catalog-operator/0.log" Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.428798 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" event={"ID":"c153b2ce-efcd-4d73-8b4f-8e67322e88c5","Type":"ContainerStarted","Data":"4c92a1763c9b5a3812608516c9260f354989f8beeb95abc72cff4c1f48b5ab2a"} Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.429347 4907 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" containerID="cri-o://ca7879c2b6a286bb69f313f0476f97ec62059e250cc32609ce16a08537d96fbf" Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.429381 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.447114 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.447089182 podStartE2EDuration="5.447089182s" podCreationTimestamp="2025-10-09 19:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:04.436122723 +0000 UTC m=+1049.968090232" watchObservedRunningTime="2025-10-09 19:46:04.447089182 +0000 UTC m=+1049.979056671" Oct 09 19:46:04 crc kubenswrapper[4907]: I1009 19:46:04.468714 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.468695443 podStartE2EDuration="5.468695443s" podCreationTimestamp="2025-10-09 19:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:04.46775554 +0000 UTC m=+1049.999723039" watchObservedRunningTime="2025-10-09 19:46:04.468695443 +0000 UTC m=+1050.000662932" Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.451383 4907 generic.go:334] "Generic (PLEG): container finished" podID="ad0a99ee-a820-4314-b0fa-6c42953f8c9a" containerID="0b94a508daf88e3c8c116f75b2d90521643ab1c9e427223336f6964034206582" exitCode=0 Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.451500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5sjd" event={"ID":"ad0a99ee-a820-4314-b0fa-6c42953f8c9a","Type":"ContainerDied","Data":"0b94a508daf88e3c8c116f75b2d90521643ab1c9e427223336f6964034206582"} Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.454164 4907 generic.go:334] "Generic (PLEG): container finished" podID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerID="1694b4ceea53e0ed1479459e84ce373780d035eb33f0912695d0126e87614ffb" exitCode=0 Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.454182 4907 generic.go:334] "Generic (PLEG): container finished" podID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerID="fd382398ab923035d451db18993d649286536c82b3e1f828175bd0ff3d13831e" exitCode=143 Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.455148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4008b367-6ed3-4514-9658-9f6916dc0cd7","Type":"ContainerDied","Data":"1694b4ceea53e0ed1479459e84ce373780d035eb33f0912695d0126e87614ffb"} Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.455189 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.455206 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4008b367-6ed3-4514-9658-9f6916dc0cd7","Type":"ContainerDied","Data":"fd382398ab923035d451db18993d649286536c82b3e1f828175bd0ff3d13831e"} Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.455439 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-log" containerID="cri-o://46c8f79491335d3b20733f1d0337dab83da46e25468c314d7518c2723fa5530c" gracePeriod=30 Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.455555 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-httpd" containerID="cri-o://bd0b5bd89c2825ba38fea7b50ba7dea3faa04d1aca19b1f3dd361e34161287c3" gracePeriod=30 Oct 09 19:46:05 crc kubenswrapper[4907]: I1009 19:46:05.458698 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4g4dt" Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.464915 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e8ad-account-create-65dcs" event={"ID":"c1ff2dda-76aa-4595-bd0c-e69456106650","Type":"ContainerDied","Data":"e73d4478b859182067464aee8341dff6c247b5082730803aa20954528ab04d2e"} Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.465173 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e73d4478b859182067464aee8341dff6c247b5082730803aa20954528ab04d2e" Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.467645 4907 generic.go:334] "Generic (PLEG): container 
finished" podID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerID="bd0b5bd89c2825ba38fea7b50ba7dea3faa04d1aca19b1f3dd361e34161287c3" exitCode=0 Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.467718 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"acc60f91-13bd-4e6a-9f47-4feced60618a","Type":"ContainerDied","Data":"bd0b5bd89c2825ba38fea7b50ba7dea3faa04d1aca19b1f3dd361e34161287c3"} Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.467748 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"acc60f91-13bd-4e6a-9f47-4feced60618a","Type":"ContainerDied","Data":"46c8f79491335d3b20733f1d0337dab83da46e25468c314d7518c2723fa5530c"} Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.467699 4907 generic.go:334] "Generic (PLEG): container finished" podID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerID="46c8f79491335d3b20733f1d0337dab83da46e25468c314d7518c2723fa5530c" exitCode=143 Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.520300 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e8ad-account-create-65dcs" Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.696191 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64grm\" (UniqueName: \"kubernetes.io/projected/c1ff2dda-76aa-4595-bd0c-e69456106650-kube-api-access-64grm\") pod \"c1ff2dda-76aa-4595-bd0c-e69456106650\" (UID: \"c1ff2dda-76aa-4595-bd0c-e69456106650\") " Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.702551 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ff2dda-76aa-4595-bd0c-e69456106650-kube-api-access-64grm" (OuterVolumeSpecName: "kube-api-access-64grm") pod "c1ff2dda-76aa-4595-bd0c-e69456106650" (UID: "c1ff2dda-76aa-4595-bd0c-e69456106650"). InnerVolumeSpecName "kube-api-access-64grm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:06 crc kubenswrapper[4907]: I1009 19:46:06.798714 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64grm\" (UniqueName: \"kubernetes.io/projected/c1ff2dda-76aa-4595-bd0c-e69456106650-kube-api-access-64grm\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:07 crc kubenswrapper[4907]: I1009 19:46:07.476696 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e8ad-account-create-65dcs" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.372032 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aecc-account-create-9lqqk" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.380988 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2d90-account-create-cwgs4" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.410720 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.446626 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.450646 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4cgb\" (UniqueName: \"kubernetes.io/projected/f8a28eaa-918c-4756-9936-1d724410617c-kube-api-access-h4cgb\") pod \"f8a28eaa-918c-4756-9936-1d724410617c\" (UID: \"f8a28eaa-918c-4756-9936-1d724410617c\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.457674 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a28eaa-918c-4756-9936-1d724410617c-kube-api-access-h4cgb" (OuterVolumeSpecName: "kube-api-access-h4cgb") pod "f8a28eaa-918c-4756-9936-1d724410617c" (UID: "f8a28eaa-918c-4756-9936-1d724410617c"). InnerVolumeSpecName "kube-api-access-h4cgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.500935 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2d90-account-create-cwgs4" event={"ID":"562823d8-4ea2-43cd-8e94-5c1c154de988","Type":"ContainerDied","Data":"daab7ff9dcac4497ea10500b8094112e85283e025e7a734cfda4df3b3ebc2f86"} Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.501007 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daab7ff9dcac4497ea10500b8094112e85283e025e7a734cfda4df3b3ebc2f86" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.501063 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2d90-account-create-cwgs4" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.505326 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-q5sjd" event={"ID":"ad0a99ee-a820-4314-b0fa-6c42953f8c9a","Type":"ContainerDied","Data":"6fd1c137843538be5bf2ea66eff045cbb4826016b8635d6ce8099f6d91cf97f9"} Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.505359 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd1c137843538be5bf2ea66eff045cbb4826016b8635d6ce8099f6d91cf97f9" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.505405 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-q5sjd" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.510093 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4008b367-6ed3-4514-9658-9f6916dc0cd7","Type":"ContainerDied","Data":"9be43d9c27c7cc7aaab7baeb7e353c731d96c36628a8c4bc5d3426cc23ab7de5"} Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.510136 4907 scope.go:117] "RemoveContainer" containerID="1694b4ceea53e0ed1479459e84ce373780d035eb33f0912695d0126e87614ffb" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.510258 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.514715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aecc-account-create-9lqqk" event={"ID":"f8a28eaa-918c-4756-9936-1d724410617c","Type":"ContainerDied","Data":"62c7ccc55db77217ff124348f67e51bde083c8cd25cdede57676fa72a1ccd1d9"} Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.514752 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62c7ccc55db77217ff124348f67e51bde083c8cd25cdede57676fa72a1ccd1d9" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.514778 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aecc-account-create-9lqqk" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552249 4907 scope.go:117] "RemoveContainer" containerID="fd382398ab923035d451db18993d649286536c82b3e1f828175bd0ff3d13831e" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552535 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-httpd-run\") pod \"4008b367-6ed3-4514-9658-9f6916dc0cd7\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552568 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-logs\") pod \"4008b367-6ed3-4514-9658-9f6916dc0cd7\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552628 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kff49\" (UniqueName: \"kubernetes.io/projected/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-kube-api-access-kff49\") pod \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\" (UID: 
\"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552655 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4008b367-6ed3-4514-9658-9f6916dc0cd7\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552673 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-credential-keys\") pod \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552693 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2gqs\" (UniqueName: \"kubernetes.io/projected/562823d8-4ea2-43cd-8e94-5c1c154de988-kube-api-access-n2gqs\") pod \"562823d8-4ea2-43cd-8e94-5c1c154de988\" (UID: \"562823d8-4ea2-43cd-8e94-5c1c154de988\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552740 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle\") pod \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552757 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-combined-ca-bundle\") pod \"4008b367-6ed3-4514-9658-9f6916dc0cd7\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552809 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ljqv\" (UniqueName: 
\"kubernetes.io/projected/4008b367-6ed3-4514-9658-9f6916dc0cd7-kube-api-access-9ljqv\") pod \"4008b367-6ed3-4514-9658-9f6916dc0cd7\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552829 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-fernet-keys\") pod \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552847 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-config-data\") pod \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552864 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-config-data\") pod \"4008b367-6ed3-4514-9658-9f6916dc0cd7\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552902 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-scripts\") pod \"4008b367-6ed3-4514-9658-9f6916dc0cd7\" (UID: \"4008b367-6ed3-4514-9658-9f6916dc0cd7\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.552919 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-scripts\") pod \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.553213 4907 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-h4cgb\" (UniqueName: \"kubernetes.io/projected/f8a28eaa-918c-4756-9936-1d724410617c-kube-api-access-h4cgb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.556329 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4008b367-6ed3-4514-9658-9f6916dc0cd7" (UID: "4008b367-6ed3-4514-9658-9f6916dc0cd7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.556356 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-logs" (OuterVolumeSpecName: "logs") pod "4008b367-6ed3-4514-9658-9f6916dc0cd7" (UID: "4008b367-6ed3-4514-9658-9f6916dc0cd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.556689 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562823d8-4ea2-43cd-8e94-5c1c154de988-kube-api-access-n2gqs" (OuterVolumeSpecName: "kube-api-access-n2gqs") pod "562823d8-4ea2-43cd-8e94-5c1c154de988" (UID: "562823d8-4ea2-43cd-8e94-5c1c154de988"). InnerVolumeSpecName "kube-api-access-n2gqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.561832 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-scripts" (OuterVolumeSpecName: "scripts") pod "ad0a99ee-a820-4314-b0fa-6c42953f8c9a" (UID: "ad0a99ee-a820-4314-b0fa-6c42953f8c9a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.562194 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-kube-api-access-kff49" (OuterVolumeSpecName: "kube-api-access-kff49") pod "ad0a99ee-a820-4314-b0fa-6c42953f8c9a" (UID: "ad0a99ee-a820-4314-b0fa-6c42953f8c9a"). InnerVolumeSpecName "kube-api-access-kff49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.563019 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ad0a99ee-a820-4314-b0fa-6c42953f8c9a" (UID: "ad0a99ee-a820-4314-b0fa-6c42953f8c9a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.567558 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ad0a99ee-a820-4314-b0fa-6c42953f8c9a" (UID: "ad0a99ee-a820-4314-b0fa-6c42953f8c9a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.568402 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "4008b367-6ed3-4514-9658-9f6916dc0cd7" (UID: "4008b367-6ed3-4514-9658-9f6916dc0cd7"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.571754 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-scripts" (OuterVolumeSpecName: "scripts") pod "4008b367-6ed3-4514-9658-9f6916dc0cd7" (UID: "4008b367-6ed3-4514-9658-9f6916dc0cd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.572149 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4008b367-6ed3-4514-9658-9f6916dc0cd7-kube-api-access-9ljqv" (OuterVolumeSpecName: "kube-api-access-9ljqv") pod "4008b367-6ed3-4514-9658-9f6916dc0cd7" (UID: "4008b367-6ed3-4514-9658-9f6916dc0cd7"). InnerVolumeSpecName "kube-api-access-9ljqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.602007 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle podName:ad0a99ee-a820-4314-b0fa-6c42953f8c9a nodeName:}" failed. No retries permitted until 2025-10-09 19:46:10.101978487 +0000 UTC m=+1055.633945976 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle") pod "ad0a99ee-a820-4314-b0fa-6c42953f8c9a" (UID: "ad0a99ee-a820-4314-b0fa-6c42953f8c9a") : error deleting /var/lib/kubelet/pods/ad0a99ee-a820-4314-b0fa-6c42953f8c9a/volume-subpaths: remove /var/lib/kubelet/pods/ad0a99ee-a820-4314-b0fa-6c42953f8c9a/volume-subpaths: no such file or directory Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.605720 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4008b367-6ed3-4514-9658-9f6916dc0cd7" (UID: "4008b367-6ed3-4514-9658-9f6916dc0cd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.610087 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-config-data" (OuterVolumeSpecName: "config-data") pod "ad0a99ee-a820-4314-b0fa-6c42953f8c9a" (UID: "ad0a99ee-a820-4314-b0fa-6c42953f8c9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.646006 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-config-data" (OuterVolumeSpecName: "config-data") pod "4008b367-6ed3-4514-9658-9f6916dc0cd7" (UID: "4008b367-6ed3-4514-9658-9f6916dc0cd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654565 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2gqs\" (UniqueName: \"kubernetes.io/projected/562823d8-4ea2-43cd-8e94-5c1c154de988-kube-api-access-n2gqs\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654596 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654610 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ljqv\" (UniqueName: \"kubernetes.io/projected/4008b367-6ed3-4514-9658-9f6916dc0cd7-kube-api-access-9ljqv\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654623 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654633 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654641 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654650 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4008b367-6ed3-4514-9658-9f6916dc0cd7-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654657 4907 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654664 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654672 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4008b367-6ed3-4514-9658-9f6916dc0cd7-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654682 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kff49\" (UniqueName: \"kubernetes.io/projected/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-kube-api-access-kff49\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654692 4907 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.654717 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.661719 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.673631 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.755824 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"acc60f91-13bd-4e6a-9f47-4feced60618a\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.756187 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-config-data\") pod \"acc60f91-13bd-4e6a-9f47-4feced60618a\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.756286 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwn9c\" (UniqueName: \"kubernetes.io/projected/acc60f91-13bd-4e6a-9f47-4feced60618a-kube-api-access-hwn9c\") pod \"acc60f91-13bd-4e6a-9f47-4feced60618a\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.756387 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-httpd-run\") pod \"acc60f91-13bd-4e6a-9f47-4feced60618a\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.756477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-scripts\") pod \"acc60f91-13bd-4e6a-9f47-4feced60618a\" (UID: 
\"acc60f91-13bd-4e6a-9f47-4feced60618a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.756569 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-logs\") pod \"acc60f91-13bd-4e6a-9f47-4feced60618a\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.756683 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-combined-ca-bundle\") pod \"acc60f91-13bd-4e6a-9f47-4feced60618a\" (UID: \"acc60f91-13bd-4e6a-9f47-4feced60618a\") " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.756890 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "acc60f91-13bd-4e6a-9f47-4feced60618a" (UID: "acc60f91-13bd-4e6a-9f47-4feced60618a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.757136 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.757206 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.763853 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc60f91-13bd-4e6a-9f47-4feced60618a-kube-api-access-hwn9c" (OuterVolumeSpecName: "kube-api-access-hwn9c") pod "acc60f91-13bd-4e6a-9f47-4feced60618a" (UID: "acc60f91-13bd-4e6a-9f47-4feced60618a"). InnerVolumeSpecName "kube-api-access-hwn9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.764059 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-logs" (OuterVolumeSpecName: "logs") pod "acc60f91-13bd-4e6a-9f47-4feced60618a" (UID: "acc60f91-13bd-4e6a-9f47-4feced60618a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.766258 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "acc60f91-13bd-4e6a-9f47-4feced60618a" (UID: "acc60f91-13bd-4e6a-9f47-4feced60618a"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.803901 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-scripts" (OuterVolumeSpecName: "scripts") pod "acc60f91-13bd-4e6a-9f47-4feced60618a" (UID: "acc60f91-13bd-4e6a-9f47-4feced60618a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.804603 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acc60f91-13bd-4e6a-9f47-4feced60618a" (UID: "acc60f91-13bd-4e6a-9f47-4feced60618a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.845581 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-config-data" (OuterVolumeSpecName: "config-data") pod "acc60f91-13bd-4e6a-9f47-4feced60618a" (UID: "acc60f91-13bd-4e6a-9f47-4feced60618a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.858749 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwn9c\" (UniqueName: \"kubernetes.io/projected/acc60f91-13bd-4e6a-9f47-4feced60618a-kube-api-access-hwn9c\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.858778 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.858787 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acc60f91-13bd-4e6a-9f47-4feced60618a-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.858795 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.858820 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.858830 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc60f91-13bd-4e6a-9f47-4feced60618a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.875127 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.905565 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.912595 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.931786 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932459 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0a99ee-a820-4314-b0fa-6c42953f8c9a" containerName="keystone-bootstrap" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932494 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0a99ee-a820-4314-b0fa-6c42953f8c9a" containerName="keystone-bootstrap" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932516 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerName="dnsmasq-dns" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932524 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerName="dnsmasq-dns" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932543 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-httpd" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932551 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-httpd" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932577 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562823d8-4ea2-43cd-8e94-5c1c154de988" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932585 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="562823d8-4ea2-43cd-8e94-5c1c154de988" containerName="mariadb-account-create" Oct 09 19:46:09 crc 
kubenswrapper[4907]: E1009 19:46:09.932609 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerName="init" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932617 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerName="init" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932636 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ff2dda-76aa-4595-bd0c-e69456106650" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932646 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ff2dda-76aa-4595-bd0c-e69456106650" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932682 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-log" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932694 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-log" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932733 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2338e4d-2560-4e36-a309-4d46241e3c1b" containerName="init" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932742 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2338e4d-2560-4e36-a309-4d46241e3c1b" containerName="init" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932753 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-httpd" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932761 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-httpd" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932780 4907 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a28eaa-918c-4756-9936-1d724410617c" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932788 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a28eaa-918c-4756-9936-1d724410617c" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: E1009 19:46:09.932814 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-log" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.932822 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-log" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933300 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-log" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933355 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="562823d8-4ea2-43cd-8e94-5c1c154de988" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933365 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0a99ee-a820-4314-b0fa-6c42953f8c9a" containerName="keystone-bootstrap" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933390 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2338e4d-2560-4e36-a309-4d46241e3c1b" containerName="init" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933406 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a28eaa-918c-4756-9936-1d724410617c" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933422 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-log" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 
19:46:09.933436 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ff2dda-76aa-4595-bd0c-e69456106650" containerName="mariadb-account-create" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933579 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" containerName="glance-httpd" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933614 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bef797-5e36-4bfd-9555-8fc0887067fa" containerName="dnsmasq-dns" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.933638 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" containerName="glance-httpd" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.935172 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.939379 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.939637 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.960771 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:09 crc kubenswrapper[4907]: I1009 19:46:09.990177 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062233 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062284 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062329 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062551 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5drr2\" (UniqueName: \"kubernetes.io/projected/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-kube-api-access-5drr2\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062654 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062699 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.062718 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.163811 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle\") pod \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\" (UID: \"ad0a99ee-a820-4314-b0fa-6c42953f8c9a\") " Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165034 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165068 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165165 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165201 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165262 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165307 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165333 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5drr2\" (UniqueName: \"kubernetes.io/projected/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-kube-api-access-5drr2\") pod \"glance-default-internal-api-0\" (UID: 
\"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165376 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.165902 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.166970 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.167233 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad0a99ee-a820-4314-b0fa-6c42953f8c9a" (UID: "ad0a99ee-a820-4314-b0fa-6c42953f8c9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.168163 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.173024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.173925 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.174121 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.179213 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 
19:46:10.191349 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5drr2\" (UniqueName: \"kubernetes.io/projected/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-kube-api-access-5drr2\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.219356 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.233918 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.271345 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.274024 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad0a99ee-a820-4314-b0fa-6c42953f8c9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.347602 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-9tqbs"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.347822 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" podUID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerName="dnsmasq-dns" containerID="cri-o://81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6" gracePeriod=10 Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.529582 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerStarted","Data":"e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020"} Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.531265 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j4pr" event={"ID":"39b99c09-f14d-462f-a4fa-e2555b429611","Type":"ContainerStarted","Data":"7a3ecf55a431fb96c7ac1f01905ed2d621dfee8f504d89a6ed9aaa4428e541ac"} Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.551318 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"acc60f91-13bd-4e6a-9f47-4feced60618a","Type":"ContainerDied","Data":"221c5b32aaffe1cdf3ed89fdc7c3ef202a688225842728d9cc90e5e44da0e8c8"} Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.551372 4907 scope.go:117] "RemoveContainer" containerID="bd0b5bd89c2825ba38fea7b50ba7dea3faa04d1aca19b1f3dd361e34161287c3" Oct 09 19:46:10 crc 
kubenswrapper[4907]: I1009 19:46:10.551618 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.579507 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-5j4pr" podStartSLOduration=3.099080418 podStartE2EDuration="11.579489756s" podCreationTimestamp="2025-10-09 19:45:59 +0000 UTC" firstStartedPulling="2025-10-09 19:46:00.784908537 +0000 UTC m=+1046.316876026" lastFinishedPulling="2025-10-09 19:46:09.265317875 +0000 UTC m=+1054.797285364" observedRunningTime="2025-10-09 19:46:10.574061802 +0000 UTC m=+1056.106029301" watchObservedRunningTime="2025-10-09 19:46:10.579489756 +0000 UTC m=+1056.111457245" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.624527 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.626886 4907 scope.go:117] "RemoveContainer" containerID="46c8f79491335d3b20733f1d0337dab83da46e25468c314d7518c2723fa5530c" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.633137 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.643687 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-q5sjd"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.665747 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-q5sjd"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.675957 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.677449 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.684076 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.687949 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.694797 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.747137 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nw9fr"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.748410 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.753371 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.753849 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.754342 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.754532 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9tb58" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.760128 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nw9fr"] Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.785014 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") 
pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.785071 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhpzt\" (UniqueName: \"kubernetes.io/projected/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-kube-api-access-fhpzt\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.785104 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.785121 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-scripts\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.785148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.785170 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-config-data\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.785185 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-logs\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.787080 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.888744 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxh2\" (UniqueName: \"kubernetes.io/projected/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-kube-api-access-4rxh2\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.888843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.888864 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-combined-ca-bundle\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.888904 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhpzt\" (UniqueName: \"kubernetes.io/projected/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-kube-api-access-fhpzt\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.888943 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.888966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-scripts\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.888996 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-credential-keys\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889048 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-config-data\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889063 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-logs\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889079 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-config-data\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-fernet-keys\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889143 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889165 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-scripts\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.889533 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.892670 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-logs\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.892976 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.899538 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-scripts\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " 
pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.899580 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.899601 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.901529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-config-data\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.917170 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhpzt\" (UniqueName: \"kubernetes.io/projected/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-kube-api-access-fhpzt\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.985644 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " pod="openstack/glance-default-external-api-0" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 
19:46:10.993056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-credential-keys\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.993089 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-config-data\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.993130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-fernet-keys\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.993161 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-scripts\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.993209 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxh2\" (UniqueName: \"kubernetes.io/projected/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-kube-api-access-4rxh2\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.993264 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-combined-ca-bundle\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.997636 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-fernet-keys\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.997853 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-config-data\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:10 crc kubenswrapper[4907]: I1009 19:46:10.998592 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-credential-keys\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.000451 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-combined-ca-bundle\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.003901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-scripts\") pod \"keystone-bootstrap-nw9fr\" (UID: 
\"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.013714 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxh2\" (UniqueName: \"kubernetes.io/projected/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-kube-api-access-4rxh2\") pod \"keystone-bootstrap-nw9fr\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.064091 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.077030 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.124707 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.180676 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4008b367-6ed3-4514-9658-9f6916dc0cd7" path="/var/lib/kubelet/pods/4008b367-6ed3-4514-9658-9f6916dc0cd7/volumes" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.185747 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc60f91-13bd-4e6a-9f47-4feced60618a" path="/var/lib/kubelet/pods/acc60f91-13bd-4e6a-9f47-4feced60618a/volumes" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.186765 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0a99ee-a820-4314-b0fa-6c42953f8c9a" path="/var/lib/kubelet/pods/ad0a99ee-a820-4314-b0fa-6c42953f8c9a/volumes" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.195484 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 
19:46:11.196045 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-nb\") pod \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.196118 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-svc\") pod \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.196142 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w5z4\" (UniqueName: \"kubernetes.io/projected/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-kube-api-access-5w5z4\") pod \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.196241 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-swift-storage-0\") pod \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.196304 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-sb\") pod \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.196320 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-config\") pod 
\"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\" (UID: \"7d55cd3b-d20a-4307-a73d-f6f3fb16f715\") " Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.200621 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-kube-api-access-5w5z4" (OuterVolumeSpecName: "kube-api-access-5w5z4") pod "7d55cd3b-d20a-4307-a73d-f6f3fb16f715" (UID: "7d55cd3b-d20a-4307-a73d-f6f3fb16f715"). InnerVolumeSpecName "kube-api-access-5w5z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.268945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d55cd3b-d20a-4307-a73d-f6f3fb16f715" (UID: "7d55cd3b-d20a-4307-a73d-f6f3fb16f715"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.274432 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d55cd3b-d20a-4307-a73d-f6f3fb16f715" (UID: "7d55cd3b-d20a-4307-a73d-f6f3fb16f715"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.283228 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d55cd3b-d20a-4307-a73d-f6f3fb16f715" (UID: "7d55cd3b-d20a-4307-a73d-f6f3fb16f715"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.288229 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-config" (OuterVolumeSpecName: "config") pod "7d55cd3b-d20a-4307-a73d-f6f3fb16f715" (UID: "7d55cd3b-d20a-4307-a73d-f6f3fb16f715"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.295449 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d55cd3b-d20a-4307-a73d-f6f3fb16f715" (UID: "7d55cd3b-d20a-4307-a73d-f6f3fb16f715"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.298523 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.298552 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.298560 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.298570 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.298580 
4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w5z4\" (UniqueName: \"kubernetes.io/projected/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-kube-api-access-5w5z4\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.298595 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d55cd3b-d20a-4307-a73d-f6f3fb16f715-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.565770 4907 generic.go:334] "Generic (PLEG): container finished" podID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerID="81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6" exitCode=0 Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.565827 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" event={"ID":"7d55cd3b-d20a-4307-a73d-f6f3fb16f715","Type":"ContainerDied","Data":"81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6"} Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.565852 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" event={"ID":"7d55cd3b-d20a-4307-a73d-f6f3fb16f715","Type":"ContainerDied","Data":"ec939bc83616eb1fa1efb5349e487b6093fdede24f4928694c5e9c17ab3538b3"} Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.565868 4907 scope.go:117] "RemoveContainer" containerID="81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.565976 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-9tqbs" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.567513 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a","Type":"ContainerStarted","Data":"1ae70108a38d472cef5b39455a8cb75ce70b0706b8cd86e79d03c3ee6fc027cd"} Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.623797 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-9tqbs"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.661204 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-9tqbs"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.688868 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-8h99z"] Oct 09 19:46:11 crc kubenswrapper[4907]: E1009 19:46:11.689227 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerName="dnsmasq-dns" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.689243 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerName="dnsmasq-dns" Oct 09 19:46:11 crc kubenswrapper[4907]: E1009 19:46:11.689277 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerName="init" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.689284 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerName="init" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.689457 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" containerName="dnsmasq-dns" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.690014 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.693324 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.693648 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.693806 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lj7xz" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.710956 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8h99z"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.745997 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.754044 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nw9fr"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.808114 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba292ca5-579c-4a89-b291-53bd3ef8d744-etc-machine-id\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.808210 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-scripts\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.808423 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-db-sync-config-data\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.808525 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-config-data\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.808552 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppp76\" (UniqueName: \"kubernetes.io/projected/ba292ca5-579c-4a89-b291-53bd3ef8d744-kube-api-access-ppp76\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.808677 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-combined-ca-bundle\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.845934 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p2jqv"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.847005 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.851817 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8wdm6"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.853001 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.855294 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.855667 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gnp5d" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.855795 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.856341 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nzb4b" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.856423 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.873134 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p2jqv"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.884721 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8wdm6"] Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910138 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-db-sync-config-data\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910191 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-config-data\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc 
kubenswrapper[4907]: I1009 19:46:11.910213 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppp76\" (UniqueName: \"kubernetes.io/projected/ba292ca5-579c-4a89-b291-53bd3ef8d744-kube-api-access-ppp76\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910243 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-config\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910259 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb9lh\" (UniqueName: \"kubernetes.io/projected/515d2cd8-8594-45c9-82c4-2605da80d58c-kube-api-access-pb9lh\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910287 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-combined-ca-bundle\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910307 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-combined-ca-bundle\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910330 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba292ca5-579c-4a89-b291-53bd3ef8d744-etc-machine-id\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910357 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-combined-ca-bundle\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910388 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-db-sync-config-data\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910426 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-scripts\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.910450 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rbnw\" (UniqueName: \"kubernetes.io/projected/574e169c-edb6-446d-be0e-7075ec99ebb1-kube-api-access-7rbnw\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.911510 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba292ca5-579c-4a89-b291-53bd3ef8d744-etc-machine-id\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.929356 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-combined-ca-bundle\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.929908 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-config-data\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.929913 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppp76\" (UniqueName: \"kubernetes.io/projected/ba292ca5-579c-4a89-b291-53bd3ef8d744-kube-api-access-ppp76\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.930341 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-scripts\") pod \"cinder-db-sync-8h99z\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:11 crc kubenswrapper[4907]: I1009 19:46:11.931764 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-db-sync-config-data\") pod \"cinder-db-sync-8h99z\" (UID: 
\"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.012956 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-config\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.013449 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb9lh\" (UniqueName: \"kubernetes.io/projected/515d2cd8-8594-45c9-82c4-2605da80d58c-kube-api-access-pb9lh\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.013514 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-combined-ca-bundle\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.013573 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-combined-ca-bundle\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.013621 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-db-sync-config-data\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:12 crc 
kubenswrapper[4907]: I1009 19:46:12.013683 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rbnw\" (UniqueName: \"kubernetes.io/projected/574e169c-edb6-446d-be0e-7075ec99ebb1-kube-api-access-7rbnw\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.017046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-combined-ca-bundle\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.019714 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-db-sync-config-data\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.020387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-config\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:12 crc kubenswrapper[4907]: W1009 19:46:12.026024 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb076f1dc_3d4e_4be1_96d6_6e0a8229ff06.slice/crio-d16348952eb3a0bcde945ff3096e331541f6e438e75144727424386aaf54be59 WatchSource:0}: Error finding container d16348952eb3a0bcde945ff3096e331541f6e438e75144727424386aaf54be59: Status 404 returned error can't find the container with id 
d16348952eb3a0bcde945ff3096e331541f6e438e75144727424386aaf54be59 Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.031711 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-combined-ca-bundle\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.034074 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb9lh\" (UniqueName: \"kubernetes.io/projected/515d2cd8-8594-45c9-82c4-2605da80d58c-kube-api-access-pb9lh\") pod \"neutron-db-sync-p2jqv\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:12 crc kubenswrapper[4907]: W1009 19:46:12.038758 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3946131f_dcc0_4dd6_bd57_2e7afbebeb78.slice/crio-2af2c0f3d20e649783eaab0b48ff358c303c504eef8faf5ec929e3d33ed9bf99 WatchSource:0}: Error finding container 2af2c0f3d20e649783eaab0b48ff358c303c504eef8faf5ec929e3d33ed9bf99: Status 404 returned error can't find the container with id 2af2c0f3d20e649783eaab0b48ff358c303c504eef8faf5ec929e3d33ed9bf99 Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.039288 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rbnw\" (UniqueName: \"kubernetes.io/projected/574e169c-edb6-446d-be0e-7075ec99ebb1-kube-api-access-7rbnw\") pod \"barbican-db-sync-8wdm6\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.056340 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.088656 4907 scope.go:117] "RemoveContainer" containerID="4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.141162 4907 scope.go:117] "RemoveContainer" containerID="81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6" Oct 09 19:46:12 crc kubenswrapper[4907]: E1009 19:46:12.144888 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6\": container with ID starting with 81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6 not found: ID does not exist" containerID="81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.144936 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6"} err="failed to get container status \"81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6\": rpc error: code = NotFound desc = could not find container \"81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6\": container with ID starting with 81a45f3e509575d61f1464b72603d7d3fce1b67d9acf44bed81ad402b4d15bc6 not found: ID does not exist" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.144967 4907 scope.go:117] "RemoveContainer" containerID="4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96" Oct 09 19:46:12 crc kubenswrapper[4907]: E1009 19:46:12.145289 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96\": container with ID starting with 
4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96 not found: ID does not exist" containerID="4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.145323 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96"} err="failed to get container status \"4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96\": rpc error: code = NotFound desc = could not find container \"4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96\": container with ID starting with 4d1d68dc7abc25ef34bd9f723f58d12fdd5082c5c77629cdb91d423bc78cbb96 not found: ID does not exist" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.181625 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.212976 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.581844 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nw9fr" event={"ID":"3946131f-dcc0-4dd6-bd57-2e7afbebeb78","Type":"ContainerStarted","Data":"9083c9c99d7b63e52c8565a521f8f36b93e578a16283f6084296af5850817f5c"} Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.582054 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nw9fr" event={"ID":"3946131f-dcc0-4dd6-bd57-2e7afbebeb78","Type":"ContainerStarted","Data":"2af2c0f3d20e649783eaab0b48ff358c303c504eef8faf5ec929e3d33ed9bf99"} Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.594291 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06","Type":"ContainerStarted","Data":"d16348952eb3a0bcde945ff3096e331541f6e438e75144727424386aaf54be59"} Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.597768 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerStarted","Data":"e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e"} Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.603939 4907 generic.go:334] "Generic (PLEG): container finished" podID="39b99c09-f14d-462f-a4fa-e2555b429611" containerID="7a3ecf55a431fb96c7ac1f01905ed2d621dfee8f504d89a6ed9aaa4428e541ac" exitCode=0 Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.604072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j4pr" event={"ID":"39b99c09-f14d-462f-a4fa-e2555b429611","Type":"ContainerDied","Data":"7a3ecf55a431fb96c7ac1f01905ed2d621dfee8f504d89a6ed9aaa4428e541ac"} Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.608258 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a","Type":"ContainerStarted","Data":"c5bc7122d48b7c9b4bc650e0224ec9c293ab19affb07db61a86db0c52367167b"} Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.615798 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nw9fr" podStartSLOduration=2.615782691 podStartE2EDuration="2.615782691s" podCreationTimestamp="2025-10-09 19:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:12.611309001 +0000 UTC m=+1058.143276490" watchObservedRunningTime="2025-10-09 19:46:12.615782691 +0000 UTC m=+1058.147750180" Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.653057 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8h99z"] Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.705190 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8wdm6"] Oct 09 19:46:12 crc kubenswrapper[4907]: I1009 19:46:12.787893 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p2jqv"] Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.169832 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d55cd3b-d20a-4307-a73d-f6f3fb16f715" path="/var/lib/kubelet/pods/7d55cd3b-d20a-4307-a73d-f6f3fb16f715/volumes" Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.650999 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06","Type":"ContainerStarted","Data":"86d032acf14b35b8e72c6120c149a990a98c8ede352d3378c1e4ab843847125d"} Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.657661 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wdm6" 
event={"ID":"574e169c-edb6-446d-be0e-7075ec99ebb1","Type":"ContainerStarted","Data":"7801606e353c70b892cfd80826ba7f3df44bd8ca5c446b550fe074ff5df3fa2e"} Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.668405 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8h99z" event={"ID":"ba292ca5-579c-4a89-b291-53bd3ef8d744","Type":"ContainerStarted","Data":"07c9bc34105f59accf2e352ed76bf26090bdcd26d3ef763df6cc85a0e33574dd"} Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.685623 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a","Type":"ContainerStarted","Data":"6de742bdf70c0ce344dd3a058bb640a743aabfdefdf1c888528251a3b365dc53"} Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.689042 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p2jqv" event={"ID":"515d2cd8-8594-45c9-82c4-2605da80d58c","Type":"ContainerStarted","Data":"c0f9ea1cb6a858298a5179a87d72088806c5152b4153fc9f3bde683e86fe5a68"} Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.689068 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p2jqv" event={"ID":"515d2cd8-8594-45c9-82c4-2605da80d58c","Type":"ContainerStarted","Data":"6f67c9dc1c994e411fd56a5777d356f7a7ddfc061e9c7ee1e42886d22656339c"} Oct 09 19:46:13 crc kubenswrapper[4907]: I1009 19:46:13.726503 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.726484153 podStartE2EDuration="4.726484153s" podCreationTimestamp="2025-10-09 19:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:13.717145684 +0000 UTC m=+1059.249113173" watchObservedRunningTime="2025-10-09 19:46:13.726484153 +0000 UTC m=+1059.258451662" Oct 09 19:46:13 crc 
kubenswrapper[4907]: I1009 19:46:13.750562 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p2jqv" podStartSLOduration=2.750544305 podStartE2EDuration="2.750544305s" podCreationTimestamp="2025-10-09 19:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:13.742819575 +0000 UTC m=+1059.274787074" watchObservedRunningTime="2025-10-09 19:46:13.750544305 +0000 UTC m=+1059.282511794" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.116213 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5j4pr" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.170351 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-config-data\") pod \"39b99c09-f14d-462f-a4fa-e2555b429611\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.170410 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-scripts\") pod \"39b99c09-f14d-462f-a4fa-e2555b429611\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.170513 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b99c09-f14d-462f-a4fa-e2555b429611-logs\") pod \"39b99c09-f14d-462f-a4fa-e2555b429611\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.170555 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-combined-ca-bundle\") pod \"39b99c09-f14d-462f-a4fa-e2555b429611\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.170640 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5skn\" (UniqueName: \"kubernetes.io/projected/39b99c09-f14d-462f-a4fa-e2555b429611-kube-api-access-m5skn\") pod \"39b99c09-f14d-462f-a4fa-e2555b429611\" (UID: \"39b99c09-f14d-462f-a4fa-e2555b429611\") " Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.177329 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b99c09-f14d-462f-a4fa-e2555b429611-logs" (OuterVolumeSpecName: "logs") pod "39b99c09-f14d-462f-a4fa-e2555b429611" (UID: "39b99c09-f14d-462f-a4fa-e2555b429611"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.180888 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-scripts" (OuterVolumeSpecName: "scripts") pod "39b99c09-f14d-462f-a4fa-e2555b429611" (UID: "39b99c09-f14d-462f-a4fa-e2555b429611"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.184399 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b99c09-f14d-462f-a4fa-e2555b429611-kube-api-access-m5skn" (OuterVolumeSpecName: "kube-api-access-m5skn") pod "39b99c09-f14d-462f-a4fa-e2555b429611" (UID: "39b99c09-f14d-462f-a4fa-e2555b429611"). InnerVolumeSpecName "kube-api-access-m5skn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.211191 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-config-data" (OuterVolumeSpecName: "config-data") pod "39b99c09-f14d-462f-a4fa-e2555b429611" (UID: "39b99c09-f14d-462f-a4fa-e2555b429611"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.238588 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39b99c09-f14d-462f-a4fa-e2555b429611" (UID: "39b99c09-f14d-462f-a4fa-e2555b429611"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.273292 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.273328 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.273337 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39b99c09-f14d-462f-a4fa-e2555b429611-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.273345 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39b99c09-f14d-462f-a4fa-e2555b429611-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:14 crc 
kubenswrapper[4907]: I1009 19:46:14.273357 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5skn\" (UniqueName: \"kubernetes.io/projected/39b99c09-f14d-462f-a4fa-e2555b429611-kube-api-access-m5skn\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.726978 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9c9b847d4-2fhz2"] Oct 09 19:46:14 crc kubenswrapper[4907]: E1009 19:46:14.727325 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b99c09-f14d-462f-a4fa-e2555b429611" containerName="placement-db-sync" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.727336 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b99c09-f14d-462f-a4fa-e2555b429611" containerName="placement-db-sync" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.727539 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b99c09-f14d-462f-a4fa-e2555b429611" containerName="placement-db-sync" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.728381 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.734231 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.734250 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.734585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j4pr" event={"ID":"39b99c09-f14d-462f-a4fa-e2555b429611","Type":"ContainerDied","Data":"6dfac73e0292bfb61a516d4bc4d773c29cf1c35d47f3d89a3fb7a9bac9c4c8ca"} Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.734619 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dfac73e0292bfb61a516d4bc4d773c29cf1c35d47f3d89a3fb7a9bac9c4c8ca" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.734660 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5j4pr" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.738323 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9c9b847d4-2fhz2"] Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.748376 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06","Type":"ContainerStarted","Data":"898f4310884167c2cbef6678a0d6f3533d2d231c3a78961160db814ca10a3361"} Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.882643 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3a3e5f-5651-4db7-975d-9ee766a36485-logs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.883104 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6cpw\" (UniqueName: \"kubernetes.io/projected/ef3a3e5f-5651-4db7-975d-9ee766a36485-kube-api-access-n6cpw\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.883127 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-public-tls-certs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.883156 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-combined-ca-bundle\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.883203 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-scripts\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.883232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-internal-tls-certs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.883256 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-config-data\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985266 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6cpw\" (UniqueName: \"kubernetes.io/projected/ef3a3e5f-5651-4db7-975d-9ee766a36485-kube-api-access-n6cpw\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985333 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-public-tls-certs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985359 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-combined-ca-bundle\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985411 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-scripts\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985454 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-internal-tls-certs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985502 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-config-data\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985532 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3a3e5f-5651-4db7-975d-9ee766a36485-logs\") pod \"placement-9c9b847d4-2fhz2\" (UID: 
\"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.985993 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef3a3e5f-5651-4db7-975d-9ee766a36485-logs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.994150 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-scripts\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.995017 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-internal-tls-certs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:14 crc kubenswrapper[4907]: I1009 19:46:14.995679 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-config-data\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:15 crc kubenswrapper[4907]: I1009 19:46:15.004031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-public-tls-certs\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:15 crc kubenswrapper[4907]: I1009 19:46:15.004123 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3a3e5f-5651-4db7-975d-9ee766a36485-combined-ca-bundle\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:15 crc kubenswrapper[4907]: I1009 19:46:15.009660 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6cpw\" (UniqueName: \"kubernetes.io/projected/ef3a3e5f-5651-4db7-975d-9ee766a36485-kube-api-access-n6cpw\") pod \"placement-9c9b847d4-2fhz2\" (UID: \"ef3a3e5f-5651-4db7-975d-9ee766a36485\") " pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:15 crc kubenswrapper[4907]: I1009 19:46:15.055414 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:15 crc kubenswrapper[4907]: I1009 19:46:15.195064 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.195046118 podStartE2EDuration="5.195046118s" podCreationTimestamp="2025-10-09 19:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:14.776055302 +0000 UTC m=+1060.308022811" watchObservedRunningTime="2025-10-09 19:46:15.195046118 +0000 UTC m=+1060.727013607" Oct 09 19:46:15 crc kubenswrapper[4907]: I1009 19:46:15.634275 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9c9b847d4-2fhz2"] Oct 09 19:46:15 crc kubenswrapper[4907]: W1009 19:46:15.641577 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef3a3e5f_5651_4db7_975d_9ee766a36485.slice/crio-9d99039b7c51d02c14a3ce5f1963a50ef66715e8fe21c5bd9f08db0e2e017b38 WatchSource:0}: Error finding container 
9d99039b7c51d02c14a3ce5f1963a50ef66715e8fe21c5bd9f08db0e2e017b38: Status 404 returned error can't find the container with id 9d99039b7c51d02c14a3ce5f1963a50ef66715e8fe21c5bd9f08db0e2e017b38 Oct 09 19:46:15 crc kubenswrapper[4907]: I1009 19:46:15.846515 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9c9b847d4-2fhz2" event={"ID":"ef3a3e5f-5651-4db7-975d-9ee766a36485","Type":"ContainerStarted","Data":"9d99039b7c51d02c14a3ce5f1963a50ef66715e8fe21c5bd9f08db0e2e017b38"} Oct 09 19:46:16 crc kubenswrapper[4907]: I1009 19:46:16.865809 4907 generic.go:334] "Generic (PLEG): container finished" podID="3946131f-dcc0-4dd6-bd57-2e7afbebeb78" containerID="9083c9c99d7b63e52c8565a521f8f36b93e578a16283f6084296af5850817f5c" exitCode=0 Oct 09 19:46:16 crc kubenswrapper[4907]: I1009 19:46:16.865860 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nw9fr" event={"ID":"3946131f-dcc0-4dd6-bd57-2e7afbebeb78","Type":"ContainerDied","Data":"9083c9c99d7b63e52c8565a521f8f36b93e578a16283f6084296af5850817f5c"} Oct 09 19:46:20 crc kubenswrapper[4907]: I1009 19:46:20.272610 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:20 crc kubenswrapper[4907]: I1009 19:46:20.272950 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:20 crc kubenswrapper[4907]: I1009 19:46:20.319312 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:20 crc kubenswrapper[4907]: I1009 19:46:20.326721 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:20 crc kubenswrapper[4907]: I1009 19:46:20.898676 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:20 crc 
kubenswrapper[4907]: I1009 19:46:20.898959 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:21 crc kubenswrapper[4907]: I1009 19:46:21.065124 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 19:46:21 crc kubenswrapper[4907]: I1009 19:46:21.065164 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 19:46:21 crc kubenswrapper[4907]: I1009 19:46:21.094384 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 19:46:21 crc kubenswrapper[4907]: I1009 19:46:21.109452 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 19:46:21 crc kubenswrapper[4907]: I1009 19:46:21.905616 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 19:46:21 crc kubenswrapper[4907]: I1009 19:46:21.905663 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.823368 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.915922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nw9fr" event={"ID":"3946131f-dcc0-4dd6-bd57-2e7afbebeb78","Type":"ContainerDied","Data":"2af2c0f3d20e649783eaab0b48ff358c303c504eef8faf5ec929e3d33ed9bf99"} Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.915985 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2af2c0f3d20e649783eaab0b48ff358c303c504eef8faf5ec929e3d33ed9bf99" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.915944 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nw9fr" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.959645 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-config-data\") pod \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.959732 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rxh2\" (UniqueName: \"kubernetes.io/projected/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-kube-api-access-4rxh2\") pod \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.959794 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-combined-ca-bundle\") pod \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.959818 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-scripts\") pod \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.959971 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-fernet-keys\") pod \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.959992 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-credential-keys\") pod \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\" (UID: \"3946131f-dcc0-4dd6-bd57-2e7afbebeb78\") " Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.967483 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3946131f-dcc0-4dd6-bd57-2e7afbebeb78" (UID: "3946131f-dcc0-4dd6-bd57-2e7afbebeb78"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.968444 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-scripts" (OuterVolumeSpecName: "scripts") pod "3946131f-dcc0-4dd6-bd57-2e7afbebeb78" (UID: "3946131f-dcc0-4dd6-bd57-2e7afbebeb78"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.976643 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3946131f-dcc0-4dd6-bd57-2e7afbebeb78" (UID: "3946131f-dcc0-4dd6-bd57-2e7afbebeb78"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.982289 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-kube-api-access-4rxh2" (OuterVolumeSpecName: "kube-api-access-4rxh2") pod "3946131f-dcc0-4dd6-bd57-2e7afbebeb78" (UID: "3946131f-dcc0-4dd6-bd57-2e7afbebeb78"). InnerVolumeSpecName "kube-api-access-4rxh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:22 crc kubenswrapper[4907]: I1009 19:46:22.997762 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-config-data" (OuterVolumeSpecName: "config-data") pod "3946131f-dcc0-4dd6-bd57-2e7afbebeb78" (UID: "3946131f-dcc0-4dd6-bd57-2e7afbebeb78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.001561 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3946131f-dcc0-4dd6-bd57-2e7afbebeb78" (UID: "3946131f-dcc0-4dd6-bd57-2e7afbebeb78"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.028689 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.029038 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.043127 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.065284 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.065317 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.065350 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.065360 4907 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.065369 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.065377 4907 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4rxh2\" (UniqueName: \"kubernetes.io/projected/3946131f-dcc0-4dd6-bd57-2e7afbebeb78-kube-api-access-4rxh2\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.884759 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.931411 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.945014 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-799f6b8dfc-hssd9"] Oct 09 19:46:23 crc kubenswrapper[4907]: E1009 19:46:23.945512 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3946131f-dcc0-4dd6-bd57-2e7afbebeb78" containerName="keystone-bootstrap" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.945538 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3946131f-dcc0-4dd6-bd57-2e7afbebeb78" containerName="keystone-bootstrap" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.945754 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3946131f-dcc0-4dd6-bd57-2e7afbebeb78" containerName="keystone-bootstrap" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.946837 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.952117 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.952274 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.952403 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9tb58" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.952117 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.954279 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-799f6b8dfc-hssd9"] Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.956111 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 19:46:23 crc kubenswrapper[4907]: I1009 19:46:23.956446 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.045601 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.086972 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-fernet-keys\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.087045 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-config-data\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.087132 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-scripts\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.087166 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-combined-ca-bundle\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.087202 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-credential-keys\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.087306 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-internal-tls-certs\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.087345 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rsp\" 
(UniqueName: \"kubernetes.io/projected/c720ed9b-76d3-441e-a0b6-81170e63f46f-kube-api-access-l4rsp\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.087381 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-public-tls-certs\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188342 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-fernet-keys\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188698 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-config-data\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188755 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-scripts\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188787 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-combined-ca-bundle\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188806 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-credential-keys\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188854 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-internal-tls-certs\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188874 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rsp\" (UniqueName: \"kubernetes.io/projected/c720ed9b-76d3-441e-a0b6-81170e63f46f-kube-api-access-l4rsp\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.188901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-public-tls-certs\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.196191 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-scripts\") pod 
\"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.196325 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-public-tls-certs\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.196880 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-fernet-keys\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.198377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-credential-keys\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.200090 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-internal-tls-certs\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.200980 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-config-data\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 
crc kubenswrapper[4907]: I1009 19:46:24.201942 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c720ed9b-76d3-441e-a0b6-81170e63f46f-combined-ca-bundle\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.210989 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rsp\" (UniqueName: \"kubernetes.io/projected/c720ed9b-76d3-441e-a0b6-81170e63f46f-kube-api-access-l4rsp\") pod \"keystone-799f6b8dfc-hssd9\" (UID: \"c720ed9b-76d3-441e-a0b6-81170e63f46f\") " pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:24 crc kubenswrapper[4907]: I1009 19:46:24.270522 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:32 crc kubenswrapper[4907]: E1009 19:46:32.468283 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 09 19:46:32 crc kubenswrapper[4907]: E1009 19:46:32.469062 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ppp76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-8h99z_openstack(ba292ca5-579c-4a89-b291-53bd3ef8d744): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 19:46:32 crc kubenswrapper[4907]: E1009 19:46:32.470520 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-8h99z" podUID="ba292ca5-579c-4a89-b291-53bd3ef8d744" Oct 09 19:46:33 crc kubenswrapper[4907]: E1009 19:46:33.027956 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-8h99z" podUID="ba292ca5-579c-4a89-b291-53bd3ef8d744" Oct 09 19:46:33 crc kubenswrapper[4907]: E1009 19:46:33.498598 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 09 19:46:33 crc kubenswrapper[4907]: E1009 19:46:33.498987 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vv2dm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(3cf74d09-587e-410e-b450-e4d5206d4f55): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 19:46:33 crc kubenswrapper[4907]: I1009 19:46:33.924654 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-799f6b8dfc-hssd9"] Oct 09 19:46:33 crc kubenswrapper[4907]: W1009 19:46:33.936582 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc720ed9b_76d3_441e_a0b6_81170e63f46f.slice/crio-a42b93fa606e0d6d9d7bdcdc70899dc28ee1381bd60cd7c0d8121e8b3c86a010 WatchSource:0}: Error finding container a42b93fa606e0d6d9d7bdcdc70899dc28ee1381bd60cd7c0d8121e8b3c86a010: 
Status 404 returned error can't find the container with id a42b93fa606e0d6d9d7bdcdc70899dc28ee1381bd60cd7c0d8121e8b3c86a010 Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.036374 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wdm6" event={"ID":"574e169c-edb6-446d-be0e-7075ec99ebb1","Type":"ContainerStarted","Data":"d3b3694f284983ba24a4792518cf413055575367f1043dc82dc875a1e207efd5"} Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.038805 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-799f6b8dfc-hssd9" event={"ID":"c720ed9b-76d3-441e-a0b6-81170e63f46f","Type":"ContainerStarted","Data":"a42b93fa606e0d6d9d7bdcdc70899dc28ee1381bd60cd7c0d8121e8b3c86a010"} Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.040844 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9c9b847d4-2fhz2" event={"ID":"ef3a3e5f-5651-4db7-975d-9ee766a36485","Type":"ContainerStarted","Data":"59ba95599909d8e35d11d01795dfd86ba2d58fb88ca914153c94e99c5b591d01"} Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.040870 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9c9b847d4-2fhz2" event={"ID":"ef3a3e5f-5651-4db7-975d-9ee766a36485","Type":"ContainerStarted","Data":"3005220822b56f296379e7a32759005101ddb195ce1484f0904c0f50f0292f31"} Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.040974 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.046276 4907 generic.go:334] "Generic (PLEG): container finished" podID="515d2cd8-8594-45c9-82c4-2605da80d58c" containerID="c0f9ea1cb6a858298a5179a87d72088806c5152b4153fc9f3bde683e86fe5a68" exitCode=0 Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.046310 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p2jqv" 
event={"ID":"515d2cd8-8594-45c9-82c4-2605da80d58c","Type":"ContainerDied","Data":"c0f9ea1cb6a858298a5179a87d72088806c5152b4153fc9f3bde683e86fe5a68"} Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.083302 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8wdm6" podStartSLOduration=2.319998925 podStartE2EDuration="23.083287223s" podCreationTimestamp="2025-10-09 19:46:11 +0000 UTC" firstStartedPulling="2025-10-09 19:46:12.722460722 +0000 UTC m=+1058.254428211" lastFinishedPulling="2025-10-09 19:46:33.48574902 +0000 UTC m=+1079.017716509" observedRunningTime="2025-10-09 19:46:34.055830698 +0000 UTC m=+1079.587798197" watchObservedRunningTime="2025-10-09 19:46:34.083287223 +0000 UTC m=+1079.615254712" Oct 09 19:46:34 crc kubenswrapper[4907]: I1009 19:46:34.093486 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9c9b847d4-2fhz2" podStartSLOduration=20.093442292 podStartE2EDuration="20.093442292s" podCreationTimestamp="2025-10-09 19:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:34.083841047 +0000 UTC m=+1079.615808546" watchObservedRunningTime="2025-10-09 19:46:34.093442292 +0000 UTC m=+1079.625409781" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.060273 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-799f6b8dfc-hssd9" event={"ID":"c720ed9b-76d3-441e-a0b6-81170e63f46f","Type":"ContainerStarted","Data":"6156cc0e276b9ccafa212267f55ff9b03da60c99b3acbc9be06f02b46fb4870e"} Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.060680 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.103905 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-799f6b8dfc-hssd9" podStartSLOduration=12.103870171 podStartE2EDuration="12.103870171s" podCreationTimestamp="2025-10-09 19:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:35.088117574 +0000 UTC m=+1080.620085123" watchObservedRunningTime="2025-10-09 19:46:35.103870171 +0000 UTC m=+1080.635837700" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.407574 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.609001 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb9lh\" (UniqueName: \"kubernetes.io/projected/515d2cd8-8594-45c9-82c4-2605da80d58c-kube-api-access-pb9lh\") pod \"515d2cd8-8594-45c9-82c4-2605da80d58c\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.609381 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-config\") pod \"515d2cd8-8594-45c9-82c4-2605da80d58c\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.609422 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-combined-ca-bundle\") pod \"515d2cd8-8594-45c9-82c4-2605da80d58c\" (UID: \"515d2cd8-8594-45c9-82c4-2605da80d58c\") " Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.616536 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515d2cd8-8594-45c9-82c4-2605da80d58c-kube-api-access-pb9lh" (OuterVolumeSpecName: "kube-api-access-pb9lh") pod "515d2cd8-8594-45c9-82c4-2605da80d58c" (UID: 
"515d2cd8-8594-45c9-82c4-2605da80d58c"). InnerVolumeSpecName "kube-api-access-pb9lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.646408 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-config" (OuterVolumeSpecName: "config") pod "515d2cd8-8594-45c9-82c4-2605da80d58c" (UID: "515d2cd8-8594-45c9-82c4-2605da80d58c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.650618 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "515d2cd8-8594-45c9-82c4-2605da80d58c" (UID: "515d2cd8-8594-45c9-82c4-2605da80d58c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.710913 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb9lh\" (UniqueName: \"kubernetes.io/projected/515d2cd8-8594-45c9-82c4-2605da80d58c-kube-api-access-pb9lh\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.710942 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:35 crc kubenswrapper[4907]: I1009 19:46:35.710952 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515d2cd8-8594-45c9-82c4-2605da80d58c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.072187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p2jqv" 
event={"ID":"515d2cd8-8594-45c9-82c4-2605da80d58c","Type":"ContainerDied","Data":"6f67c9dc1c994e411fd56a5777d356f7a7ddfc061e9c7ee1e42886d22656339c"} Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.072234 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f67c9dc1c994e411fd56a5777d356f7a7ddfc061e9c7ee1e42886d22656339c" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.072271 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p2jqv" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.073757 4907 generic.go:334] "Generic (PLEG): container finished" podID="574e169c-edb6-446d-be0e-7075ec99ebb1" containerID="d3b3694f284983ba24a4792518cf413055575367f1043dc82dc875a1e207efd5" exitCode=0 Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.073811 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wdm6" event={"ID":"574e169c-edb6-446d-be0e-7075ec99ebb1","Type":"ContainerDied","Data":"d3b3694f284983ba24a4792518cf413055575367f1043dc82dc875a1e207efd5"} Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.074171 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.247037 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-s7jjc"] Oct 09 19:46:36 crc kubenswrapper[4907]: E1009 19:46:36.247501 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515d2cd8-8594-45c9-82c4-2605da80d58c" containerName="neutron-db-sync" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.247521 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="515d2cd8-8594-45c9-82c4-2605da80d58c" containerName="neutron-db-sync" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.247775 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="515d2cd8-8594-45c9-82c4-2605da80d58c" containerName="neutron-db-sync" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.248948 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.257193 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-s7jjc"] Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.298904 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.298971 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.337199 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7496575b44-md76p"] Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.338977 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.342949 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gnp5d" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.343154 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.343299 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.343516 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.360455 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7496575b44-md76p"] Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.420814 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.420877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.420900 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.421126 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.421185 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r97p\" (UniqueName: \"kubernetes.io/projected/988dc8ef-dc2f-447c-8689-7a2e500eb773-kube-api-access-4r97p\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.421489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-config\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.522801 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r97p\" (UniqueName: \"kubernetes.io/projected/988dc8ef-dc2f-447c-8689-7a2e500eb773-kube-api-access-4r97p\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.522848 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-ovndb-tls-certs\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.522877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-httpd-config\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.522928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-combined-ca-bundle\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.522969 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sskbl\" (UniqueName: \"kubernetes.io/projected/16b9987f-f46c-4f23-851d-152c49a34fea-kube-api-access-sskbl\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.522991 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-config\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.523024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.523065 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.523090 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.523108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-config\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.523130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.524338 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.525126 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-config\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.526343 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.526982 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.527326 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.544666 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r97p\" (UniqueName: \"kubernetes.io/projected/988dc8ef-dc2f-447c-8689-7a2e500eb773-kube-api-access-4r97p\") pod 
\"dnsmasq-dns-84b966f6c9-s7jjc\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.580794 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.624775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-ovndb-tls-certs\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.624839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-httpd-config\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.624889 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-combined-ca-bundle\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.624942 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sskbl\" (UniqueName: \"kubernetes.io/projected/16b9987f-f46c-4f23-851d-152c49a34fea-kube-api-access-sskbl\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.625018 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-config\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.630287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-ovndb-tls-certs\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.630632 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-combined-ca-bundle\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.632181 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-httpd-config\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.632370 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-config\") pod \"neutron-7496575b44-md76p\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.644898 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sskbl\" (UniqueName: \"kubernetes.io/projected/16b9987f-f46c-4f23-851d-152c49a34fea-kube-api-access-sskbl\") pod \"neutron-7496575b44-md76p\" (UID: 
\"16b9987f-f46c-4f23-851d-152c49a34fea\") " pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:36 crc kubenswrapper[4907]: I1009 19:46:36.672156 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:37 crc kubenswrapper[4907]: I1009 19:46:37.109804 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-s7jjc"] Oct 09 19:46:37 crc kubenswrapper[4907]: W1009 19:46:37.116778 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod988dc8ef_dc2f_447c_8689_7a2e500eb773.slice/crio-23677ad707576f92c865d6c1d1331dbde191c3e4eb1e92418c12c0c2d786713d WatchSource:0}: Error finding container 23677ad707576f92c865d6c1d1331dbde191c3e4eb1e92418c12c0c2d786713d: Status 404 returned error can't find the container with id 23677ad707576f92c865d6c1d1331dbde191c3e4eb1e92418c12c0c2d786713d Oct 09 19:46:37 crc kubenswrapper[4907]: I1009 19:46:37.301500 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7496575b44-md76p"] Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.101848 4907 generic.go:334] "Generic (PLEG): container finished" podID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerID="adb0ecd79cb061d21c5217eb84255c27468b690cbd7097c77c594ebbb010083e" exitCode=0 Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.101905 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" event={"ID":"988dc8ef-dc2f-447c-8689-7a2e500eb773","Type":"ContainerDied","Data":"adb0ecd79cb061d21c5217eb84255c27468b690cbd7097c77c594ebbb010083e"} Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.102170 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" 
event={"ID":"988dc8ef-dc2f-447c-8689-7a2e500eb773","Type":"ContainerStarted","Data":"23677ad707576f92c865d6c1d1331dbde191c3e4eb1e92418c12c0c2d786713d"} Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.343392 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68f5dff589-gkl29"] Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.345265 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.348988 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.354937 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.366316 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68f5dff589-gkl29"] Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.368368 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-public-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.368432 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-config\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.368575 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gqnp\" (UniqueName: 
\"kubernetes.io/projected/23247623-e419-4c41-a5dd-1ec60cdc8ccd-kube-api-access-6gqnp\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.368614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-ovndb-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.368647 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-combined-ca-bundle\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.368686 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-httpd-config\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.368714 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-internal-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.471899 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-httpd-config\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.471947 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-internal-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.472016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-public-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.472043 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-config\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.472112 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gqnp\" (UniqueName: \"kubernetes.io/projected/23247623-e419-4c41-a5dd-1ec60cdc8ccd-kube-api-access-6gqnp\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.472140 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-ovndb-tls-certs\") pod 
\"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.472170 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-combined-ca-bundle\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.476810 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-internal-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.477265 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-ovndb-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.479323 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-combined-ca-bundle\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.488482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-config\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 
19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.490340 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-public-tls-certs\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.490966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/23247623-e419-4c41-a5dd-1ec60cdc8ccd-httpd-config\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.494355 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gqnp\" (UniqueName: \"kubernetes.io/projected/23247623-e419-4c41-a5dd-1ec60cdc8ccd-kube-api-access-6gqnp\") pod \"neutron-68f5dff589-gkl29\" (UID: \"23247623-e419-4c41-a5dd-1ec60cdc8ccd\") " pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:38 crc kubenswrapper[4907]: I1009 19:46:38.672451 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:41 crc kubenswrapper[4907]: W1009 19:46:41.096636 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16b9987f_f46c_4f23_851d_152c49a34fea.slice/crio-27be74c801b4f51b3ba3039da869ab69df8ebc251d7fc6c5644e4eebbfe95e2f WatchSource:0}: Error finding container 27be74c801b4f51b3ba3039da869ab69df8ebc251d7fc6c5644e4eebbfe95e2f: Status 404 returned error can't find the container with id 27be74c801b4f51b3ba3039da869ab69df8ebc251d7fc6c5644e4eebbfe95e2f Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.142137 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7496575b44-md76p" event={"ID":"16b9987f-f46c-4f23-851d-152c49a34fea","Type":"ContainerStarted","Data":"27be74c801b4f51b3ba3039da869ab69df8ebc251d7fc6c5644e4eebbfe95e2f"} Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.143902 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wdm6" event={"ID":"574e169c-edb6-446d-be0e-7075ec99ebb1","Type":"ContainerDied","Data":"7801606e353c70b892cfd80826ba7f3df44bd8ca5c446b550fe074ff5df3fa2e"} Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.143925 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7801606e353c70b892cfd80826ba7f3df44bd8ca5c446b550fe074ff5df3fa2e" Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.212483 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.324285 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rbnw\" (UniqueName: \"kubernetes.io/projected/574e169c-edb6-446d-be0e-7075ec99ebb1-kube-api-access-7rbnw\") pod \"574e169c-edb6-446d-be0e-7075ec99ebb1\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.324425 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-db-sync-config-data\") pod \"574e169c-edb6-446d-be0e-7075ec99ebb1\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.324445 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-combined-ca-bundle\") pod \"574e169c-edb6-446d-be0e-7075ec99ebb1\" (UID: \"574e169c-edb6-446d-be0e-7075ec99ebb1\") " Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.329723 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574e169c-edb6-446d-be0e-7075ec99ebb1-kube-api-access-7rbnw" (OuterVolumeSpecName: "kube-api-access-7rbnw") pod "574e169c-edb6-446d-be0e-7075ec99ebb1" (UID: "574e169c-edb6-446d-be0e-7075ec99ebb1"). InnerVolumeSpecName "kube-api-access-7rbnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.331253 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "574e169c-edb6-446d-be0e-7075ec99ebb1" (UID: "574e169c-edb6-446d-be0e-7075ec99ebb1"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.374703 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "574e169c-edb6-446d-be0e-7075ec99ebb1" (UID: "574e169c-edb6-446d-be0e-7075ec99ebb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.426940 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.426967 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574e169c-edb6-446d-be0e-7075ec99ebb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:41 crc kubenswrapper[4907]: I1009 19:46:41.426978 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rbnw\" (UniqueName: \"kubernetes.io/projected/574e169c-edb6-446d-be0e-7075ec99ebb1-kube-api-access-7rbnw\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:42 crc kubenswrapper[4907]: E1009 19:46:42.096396 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.162261 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7496575b44-md76p" 
event={"ID":"16b9987f-f46c-4f23-851d-152c49a34fea","Type":"ContainerStarted","Data":"80a3619fa77a23367b745d811eeb278943eee9bb68e249c257b3d1ef99d38ed4"} Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.174882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerStarted","Data":"6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad"} Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.174971 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-central-agent" containerID="cri-o://e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020" gracePeriod=30 Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.175014 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-notification-agent" containerID="cri-o://e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e" gracePeriod=30 Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.175062 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.175406 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="proxy-httpd" containerID="cri-o://6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad" gracePeriod=30 Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.184843 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8wdm6" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.184883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" event={"ID":"988dc8ef-dc2f-447c-8689-7a2e500eb773","Type":"ContainerStarted","Data":"137702f6571a83a627cf06b3ac06d87f1a0ced43e84ceeb0f49a7107a6615614"} Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.210131 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68f5dff589-gkl29"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.239363 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" podStartSLOduration=6.239344521 podStartE2EDuration="6.239344521s" podCreationTimestamp="2025-10-09 19:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:42.233322543 +0000 UTC m=+1087.765290032" watchObservedRunningTime="2025-10-09 19:46:42.239344521 +0000 UTC m=+1087.771312010" Oct 09 19:46:42 crc kubenswrapper[4907]: E1009 19:46:42.408560 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod574e169c_edb6_446d_be0e_7075ec99ebb1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod574e169c_edb6_446d_be0e_7075ec99ebb1.slice/crio-7801606e353c70b892cfd80826ba7f3df44bd8ca5c446b550fe074ff5df3fa2e\": RecentStats: unable to find data in memory cache]" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.474792 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5db85b5857-58t94"] Oct 09 19:46:42 crc kubenswrapper[4907]: E1009 19:46:42.475201 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="574e169c-edb6-446d-be0e-7075ec99ebb1" containerName="barbican-db-sync" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.475212 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="574e169c-edb6-446d-be0e-7075ec99ebb1" containerName="barbican-db-sync" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.475360 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="574e169c-edb6-446d-be0e-7075ec99ebb1" containerName="barbican-db-sync" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.476200 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.487508 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7484f7b746-btdlm"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.488948 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.489574 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.489633 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nzb4b" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.490092 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.494337 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.507686 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5db85b5857-58t94"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.522611 4907 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7484f7b746-btdlm"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.561482 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-s7jjc"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.569369 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-combined-ca-bundle\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.569496 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-config-data-custom\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.569547 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-config-data\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.569564 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj6bb\" (UniqueName: \"kubernetes.io/projected/856ed5f9-dc9b-43db-9d28-b1e400d25798-kube-api-access-fj6bb\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.569599 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/856ed5f9-dc9b-43db-9d28-b1e400d25798-logs\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.596173 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-hcwrp"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.597767 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.608075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-hcwrp"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671088 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/856ed5f9-dc9b-43db-9d28-b1e400d25798-logs\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhh7h\" (UniqueName: \"kubernetes.io/projected/287de68c-1c57-4f07-ba04-4d0899b26673-kube-api-access-bhh7h\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671505 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-config-data-custom\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" 
(UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671545 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-combined-ca-bundle\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671574 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/287de68c-1c57-4f07-ba04-4d0899b26673-logs\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671651 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-combined-ca-bundle\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671671 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-config-data-custom\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671719 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-config-data\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671780 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-config-data\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.671813 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj6bb\" (UniqueName: \"kubernetes.io/projected/856ed5f9-dc9b-43db-9d28-b1e400d25798-kube-api-access-fj6bb\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.672744 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/856ed5f9-dc9b-43db-9d28-b1e400d25798-logs\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.676558 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-575bf6fcb-29gs7"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.677979 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.682940 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-config-data-custom\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.683321 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-config-data\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.687790 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.688189 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856ed5f9-dc9b-43db-9d28-b1e400d25798-combined-ca-bundle\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.692194 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-575bf6fcb-29gs7"] Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.696232 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj6bb\" (UniqueName: \"kubernetes.io/projected/856ed5f9-dc9b-43db-9d28-b1e400d25798-kube-api-access-fj6bb\") pod \"barbican-worker-5db85b5857-58t94\" (UID: \"856ed5f9-dc9b-43db-9d28-b1e400d25798\") " pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc 
kubenswrapper[4907]: I1009 19:46:42.773645 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-combined-ca-bundle\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.773824 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-config-data-custom\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.773938 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/287de68c-1c57-4f07-ba04-4d0899b26673-logs\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.774043 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d4k4\" (UniqueName: \"kubernetes.io/projected/a3160fd3-8937-418d-846b-47aff379cded-kube-api-access-4d4k4\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.774130 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-logs\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " 
pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.774283 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-combined-ca-bundle\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.774391 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.774494 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-config-data\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.774631 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.774788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data-custom\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: 
\"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.775061 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-config\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.775116 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/287de68c-1c57-4f07-ba04-4d0899b26673-logs\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.775239 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.775342 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.775457 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhh7h\" (UniqueName: \"kubernetes.io/projected/287de68c-1c57-4f07-ba04-4d0899b26673-kube-api-access-bhh7h\") pod 
\"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.775598 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.775686 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldz4m\" (UniqueName: \"kubernetes.io/projected/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-kube-api-access-ldz4m\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.778401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-config-data-custom\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.778434 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-config-data\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.779452 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/287de68c-1c57-4f07-ba04-4d0899b26673-combined-ca-bundle\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.790999 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhh7h\" (UniqueName: \"kubernetes.io/projected/287de68c-1c57-4f07-ba04-4d0899b26673-kube-api-access-bhh7h\") pod \"barbican-keystone-listener-7484f7b746-btdlm\" (UID: \"287de68c-1c57-4f07-ba04-4d0899b26673\") " pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.818231 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db85b5857-58t94" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.836031 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877413 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data-custom\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-config\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877533 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877567 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877594 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877611 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldz4m\" (UniqueName: \"kubernetes.io/projected/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-kube-api-access-ldz4m\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877630 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-combined-ca-bundle\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877667 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d4k4\" (UniqueName: 
\"kubernetes.io/projected/a3160fd3-8937-418d-846b-47aff379cded-kube-api-access-4d4k4\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877682 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-logs\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877777 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.877811 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.878816 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.879296 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-swift-storage-0\") pod 
\"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.879898 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-logs\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.880758 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.881606 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-config\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.882394 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.885729 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-combined-ca-bundle\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " 
pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.886514 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data-custom\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.887474 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.902206 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldz4m\" (UniqueName: \"kubernetes.io/projected/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-kube-api-access-ldz4m\") pod \"barbican-api-575bf6fcb-29gs7\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.905286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d4k4\" (UniqueName: \"kubernetes.io/projected/a3160fd3-8937-418d-846b-47aff379cded-kube-api-access-4d4k4\") pod \"dnsmasq-dns-75c8ddd69c-hcwrp\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:42 crc kubenswrapper[4907]: I1009 19:46:42.927196 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.002849 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.255129 4907 generic.go:334] "Generic (PLEG): container finished" podID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerID="e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020" exitCode=0 Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.255479 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerDied","Data":"e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020"} Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.258625 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f5dff589-gkl29" event={"ID":"23247623-e419-4c41-a5dd-1ec60cdc8ccd","Type":"ContainerStarted","Data":"d728e1111e57f85535f6fe0fdc5b4e50edaded0cec354fa082eadf391825716f"} Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.258696 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f5dff589-gkl29" event={"ID":"23247623-e419-4c41-a5dd-1ec60cdc8ccd","Type":"ContainerStarted","Data":"0cb1fead05111eb1c68637c468cbc174b222acec9fd01f765ae35d6aaf79f7a5"} Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.258709 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f5dff589-gkl29" event={"ID":"23247623-e419-4c41-a5dd-1ec60cdc8ccd","Type":"ContainerStarted","Data":"867d10865229e73184687399995f45a20b32fd5fa74322a2d5104203726af098"} Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.259774 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.271423 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7496575b44-md76p" 
event={"ID":"16b9987f-f46c-4f23-851d-152c49a34fea","Type":"ContainerStarted","Data":"ee86b2e12031a669857531dfec780c7c87280b227afd9d3cd8fecabbce2e4e68"} Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.271496 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.271962 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7496575b44-md76p" Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.312459 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68f5dff589-gkl29" podStartSLOduration=5.312436798 podStartE2EDuration="5.312436798s" podCreationTimestamp="2025-10-09 19:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:43.28809336 +0000 UTC m=+1088.820060879" watchObservedRunningTime="2025-10-09 19:46:43.312436798 +0000 UTC m=+1088.844404287" Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.337559 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7496575b44-md76p" podStartSLOduration=7.337544215 podStartE2EDuration="7.337544215s" podCreationTimestamp="2025-10-09 19:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:43.312355876 +0000 UTC m=+1088.844323385" watchObservedRunningTime="2025-10-09 19:46:43.337544215 +0000 UTC m=+1088.869511704" Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.500423 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5db85b5857-58t94"] Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.680972 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7484f7b746-btdlm"] Oct 09 19:46:43 crc 
kubenswrapper[4907]: W1009 19:46:43.683576 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod287de68c_1c57_4f07_ba04_4d0899b26673.slice/crio-254b17abb7b309899f46128500cfd2cc22dfabdd4a7ee1c8e109a06c94cec364 WatchSource:0}: Error finding container 254b17abb7b309899f46128500cfd2cc22dfabdd4a7ee1c8e109a06c94cec364: Status 404 returned error can't find the container with id 254b17abb7b309899f46128500cfd2cc22dfabdd4a7ee1c8e109a06c94cec364 Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.888073 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-hcwrp"] Oct 09 19:46:43 crc kubenswrapper[4907]: W1009 19:46:43.889111 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3160fd3_8937_418d_846b_47aff379cded.slice/crio-3849cb78afda476625babf74eed45b6aa425295354a5250446112fdee5309edf WatchSource:0}: Error finding container 3849cb78afda476625babf74eed45b6aa425295354a5250446112fdee5309edf: Status 404 returned error can't find the container with id 3849cb78afda476625babf74eed45b6aa425295354a5250446112fdee5309edf Oct 09 19:46:43 crc kubenswrapper[4907]: I1009 19:46:43.897183 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-575bf6fcb-29gs7"] Oct 09 19:46:43 crc kubenswrapper[4907]: W1009 19:46:43.910284 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c8c3c56_1fea_44e4_b03f_0d54ac61ab87.slice/crio-9bb70c6a6ce20661378210ed0307ee1533bf54163482107aef2b0645ed5cba9b WatchSource:0}: Error finding container 9bb70c6a6ce20661378210ed0307ee1533bf54163482107aef2b0645ed5cba9b: Status 404 returned error can't find the container with id 9bb70c6a6ce20661378210ed0307ee1533bf54163482107aef2b0645ed5cba9b Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.279785 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db85b5857-58t94" event={"ID":"856ed5f9-dc9b-43db-9d28-b1e400d25798","Type":"ContainerStarted","Data":"e795698403e838601b0063e53c873c0c33560705314c5d5b974538c1589a2101"} Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.282930 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" event={"ID":"287de68c-1c57-4f07-ba04-4d0899b26673","Type":"ContainerStarted","Data":"254b17abb7b309899f46128500cfd2cc22dfabdd4a7ee1c8e109a06c94cec364"} Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.286883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575bf6fcb-29gs7" event={"ID":"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87","Type":"ContainerStarted","Data":"b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377"} Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.286928 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575bf6fcb-29gs7" event={"ID":"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87","Type":"ContainerStarted","Data":"9bb70c6a6ce20661378210ed0307ee1533bf54163482107aef2b0645ed5cba9b"} Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.293689 4907 generic.go:334] "Generic (PLEG): container finished" podID="a3160fd3-8937-418d-846b-47aff379cded" containerID="ec5204cd930f394043964447e7f51893e57753364973377ae21df8788796a814" exitCode=0 Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.294925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" event={"ID":"a3160fd3-8937-418d-846b-47aff379cded","Type":"ContainerDied","Data":"ec5204cd930f394043964447e7f51893e57753364973377ae21df8788796a814"} Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.294950 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" 
event={"ID":"a3160fd3-8937-418d-846b-47aff379cded","Type":"ContainerStarted","Data":"3849cb78afda476625babf74eed45b6aa425295354a5250446112fdee5309edf"} Oct 09 19:46:44 crc kubenswrapper[4907]: I1009 19:46:44.295458 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" podUID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerName="dnsmasq-dns" containerID="cri-o://137702f6571a83a627cf06b3ac06d87f1a0ced43e84ceeb0f49a7107a6615614" gracePeriod=10 Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.056516 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cbc76cff8-xmlsw"] Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.058341 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.061509 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.061757 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.076109 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbc76cff8-xmlsw"] Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.232434 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-config-data-custom\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.232527 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-config-data\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.232582 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9jt5\" (UniqueName: \"kubernetes.io/projected/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-kube-api-access-r9jt5\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.232652 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-internal-tls-certs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.232700 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-public-tls-certs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.232724 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-combined-ca-bundle\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.232767 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-logs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.304967 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575bf6fcb-29gs7" event={"ID":"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87","Type":"ContainerStarted","Data":"cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353"} Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.312870 4907 generic.go:334] "Generic (PLEG): container finished" podID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerID="137702f6571a83a627cf06b3ac06d87f1a0ced43e84ceeb0f49a7107a6615614" exitCode=0 Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.312959 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" event={"ID":"988dc8ef-dc2f-447c-8689-7a2e500eb773","Type":"ContainerDied","Data":"137702f6571a83a627cf06b3ac06d87f1a0ced43e84ceeb0f49a7107a6615614"} Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.318574 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" event={"ID":"a3160fd3-8937-418d-846b-47aff379cded","Type":"ContainerStarted","Data":"85558c61f6f1dbef6e741983569b6116b67a31e8447e5a5554718fece782aa80"} Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.337134 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-public-tls-certs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.337209 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-combined-ca-bundle\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.337281 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-logs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.337308 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-config-data-custom\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.337375 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-config-data\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.337439 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9jt5\" (UniqueName: \"kubernetes.io/projected/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-kube-api-access-r9jt5\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.337520 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-internal-tls-certs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.339342 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-logs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.347891 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-combined-ca-bundle\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.348077 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-config-data-custom\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.349025 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-public-tls-certs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.349818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-internal-tls-certs\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.358701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9jt5\" (UniqueName: \"kubernetes.io/projected/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-kube-api-access-r9jt5\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.358976 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a93848-dc0b-480e-9ec9-fc16c88e00dc-config-data\") pod \"barbican-api-7cbc76cff8-xmlsw\" (UID: \"f9a93848-dc0b-480e-9ec9-fc16c88e00dc\") " pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.388395 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.683648 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.848358 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r97p\" (UniqueName: \"kubernetes.io/projected/988dc8ef-dc2f-447c-8689-7a2e500eb773-kube-api-access-4r97p\") pod \"988dc8ef-dc2f-447c-8689-7a2e500eb773\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.848526 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-nb\") pod \"988dc8ef-dc2f-447c-8689-7a2e500eb773\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.848547 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-config\") pod \"988dc8ef-dc2f-447c-8689-7a2e500eb773\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.849283 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-sb\") pod \"988dc8ef-dc2f-447c-8689-7a2e500eb773\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.849331 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-swift-storage-0\") pod \"988dc8ef-dc2f-447c-8689-7a2e500eb773\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.849361 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-svc\") pod \"988dc8ef-dc2f-447c-8689-7a2e500eb773\" (UID: \"988dc8ef-dc2f-447c-8689-7a2e500eb773\") " Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.854049 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988dc8ef-dc2f-447c-8689-7a2e500eb773-kube-api-access-4r97p" (OuterVolumeSpecName: "kube-api-access-4r97p") pod "988dc8ef-dc2f-447c-8689-7a2e500eb773" (UID: "988dc8ef-dc2f-447c-8689-7a2e500eb773"). InnerVolumeSpecName "kube-api-access-4r97p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.936033 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "988dc8ef-dc2f-447c-8689-7a2e500eb773" (UID: "988dc8ef-dc2f-447c-8689-7a2e500eb773"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.936049 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-config" (OuterVolumeSpecName: "config") pod "988dc8ef-dc2f-447c-8689-7a2e500eb773" (UID: "988dc8ef-dc2f-447c-8689-7a2e500eb773"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.937221 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "988dc8ef-dc2f-447c-8689-7a2e500eb773" (UID: "988dc8ef-dc2f-447c-8689-7a2e500eb773"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.943730 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "988dc8ef-dc2f-447c-8689-7a2e500eb773" (UID: "988dc8ef-dc2f-447c-8689-7a2e500eb773"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.946746 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "988dc8ef-dc2f-447c-8689-7a2e500eb773" (UID: "988dc8ef-dc2f-447c-8689-7a2e500eb773"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.951293 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.951333 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.951346 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.951360 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" 
Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.951377 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988dc8ef-dc2f-447c-8689-7a2e500eb773-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.951389 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r97p\" (UniqueName: \"kubernetes.io/projected/988dc8ef-dc2f-447c-8689-7a2e500eb773-kube-api-access-4r97p\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:45 crc kubenswrapper[4907]: I1009 19:46:45.974976 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbc76cff8-xmlsw"] Oct 09 19:46:46 crc kubenswrapper[4907]: I1009 19:46:46.326017 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbc76cff8-xmlsw" event={"ID":"f9a93848-dc0b-480e-9ec9-fc16c88e00dc","Type":"ContainerStarted","Data":"c95b63f6c37a2e093d9e5f99ca2998e33d9bf176d90caefc8f3f0a34fea2d6ee"} Oct 09 19:46:46 crc kubenswrapper[4907]: I1009 19:46:46.327922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" event={"ID":"988dc8ef-dc2f-447c-8689-7a2e500eb773","Type":"ContainerDied","Data":"23677ad707576f92c865d6c1d1331dbde191c3e4eb1e92418c12c0c2d786713d"} Oct 09 19:46:46 crc kubenswrapper[4907]: I1009 19:46:46.327961 4907 scope.go:117] "RemoveContainer" containerID="137702f6571a83a627cf06b3ac06d87f1a0ced43e84ceeb0f49a7107a6615614" Oct 09 19:46:46 crc kubenswrapper[4907]: I1009 19:46:46.327976 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-s7jjc" Oct 09 19:46:46 crc kubenswrapper[4907]: I1009 19:46:46.358085 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-s7jjc"] Oct 09 19:46:46 crc kubenswrapper[4907]: I1009 19:46:46.370000 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-s7jjc"] Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.163766 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988dc8ef-dc2f-447c-8689-7a2e500eb773" path="/var/lib/kubelet/pods/988dc8ef-dc2f-447c-8689-7a2e500eb773/volumes" Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.242760 4907 scope.go:117] "RemoveContainer" containerID="adb0ecd79cb061d21c5217eb84255c27468b690cbd7097c77c594ebbb010083e" Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.342956 4907 generic.go:334] "Generic (PLEG): container finished" podID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerID="e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e" exitCode=0 Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.343060 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerDied","Data":"e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e"} Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.344598 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.344643 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.375412 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-575bf6fcb-29gs7" podStartSLOduration=5.375196346 podStartE2EDuration="5.375196346s" 
podCreationTimestamp="2025-10-09 19:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:47.372170112 +0000 UTC m=+1092.904137591" watchObservedRunningTime="2025-10-09 19:46:47.375196346 +0000 UTC m=+1092.907163845" Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.819301 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:47 crc kubenswrapper[4907]: I1009 19:46:47.820334 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9c9b847d4-2fhz2" Oct 09 19:46:48 crc kubenswrapper[4907]: I1009 19:46:48.353562 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbc76cff8-xmlsw" event={"ID":"f9a93848-dc0b-480e-9ec9-fc16c88e00dc","Type":"ContainerStarted","Data":"61ebf2c6591db8029c6af9d60d59e96c60eb975098ceb31e25998a181adf2633"} Oct 09 19:46:48 crc kubenswrapper[4907]: I1009 19:46:48.377779 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" podStartSLOduration=6.377759601 podStartE2EDuration="6.377759601s" podCreationTimestamp="2025-10-09 19:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:48.370979845 +0000 UTC m=+1093.902947354" watchObservedRunningTime="2025-10-09 19:46:48.377759601 +0000 UTC m=+1093.909727110" Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.386757 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbc76cff8-xmlsw" event={"ID":"f9a93848-dc0b-480e-9ec9-fc16c88e00dc","Type":"ContainerStarted","Data":"2857eba1d0797c8e7c0eb0913b514934ee246a254be814889f4e959f9d4a1703"} Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.387136 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.387157 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.391316 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db85b5857-58t94" event={"ID":"856ed5f9-dc9b-43db-9d28-b1e400d25798","Type":"ContainerStarted","Data":"96c0541ca6cc74f85feeac85a12e5ee594fcdaabed29a0578cb069080ab9bfec"} Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.391354 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db85b5857-58t94" event={"ID":"856ed5f9-dc9b-43db-9d28-b1e400d25798","Type":"ContainerStarted","Data":"0b1f0a77661b9588e95d656bdb30e8a4cc13d9b970cdbf551118e7bb2bbada37"} Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.397416 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" event={"ID":"287de68c-1c57-4f07-ba04-4d0899b26673","Type":"ContainerStarted","Data":"cbfe214274fb3ed464ae967dc53061b167a2716b525a8866c7f143b506be9bcd"} Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.397456 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" event={"ID":"287de68c-1c57-4f07-ba04-4d0899b26673","Type":"ContainerStarted","Data":"bcc99f1aa5f98f65aa04c0a6e2c53b6f6516fab85317aec998bb3a7078c9776b"} Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.405569 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cbc76cff8-xmlsw" podStartSLOduration=4.405550527 podStartE2EDuration="4.405550527s" podCreationTimestamp="2025-10-09 19:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 
19:46:49.402293406 +0000 UTC m=+1094.934260895" watchObservedRunningTime="2025-10-09 19:46:49.405550527 +0000 UTC m=+1094.937518016" Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.426116 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7484f7b746-btdlm" podStartSLOduration=2.364751495 podStartE2EDuration="7.426102882s" podCreationTimestamp="2025-10-09 19:46:42 +0000 UTC" firstStartedPulling="2025-10-09 19:46:43.693906121 +0000 UTC m=+1089.225873610" lastFinishedPulling="2025-10-09 19:46:48.755257508 +0000 UTC m=+1094.287224997" observedRunningTime="2025-10-09 19:46:49.424355049 +0000 UTC m=+1094.956322538" watchObservedRunningTime="2025-10-09 19:46:49.426102882 +0000 UTC m=+1094.958070371" Oct 09 19:46:49 crc kubenswrapper[4907]: I1009 19:46:49.457876 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5db85b5857-58t94" podStartSLOduration=2.20743478 podStartE2EDuration="7.457854692s" podCreationTimestamp="2025-10-09 19:46:42 +0000 UTC" firstStartedPulling="2025-10-09 19:46:43.503742999 +0000 UTC m=+1089.035710488" lastFinishedPulling="2025-10-09 19:46:48.754162911 +0000 UTC m=+1094.286130400" observedRunningTime="2025-10-09 19:46:49.444548965 +0000 UTC m=+1094.976516464" watchObservedRunningTime="2025-10-09 19:46:49.457854692 +0000 UTC m=+1094.989822181" Oct 09 19:46:50 crc kubenswrapper[4907]: I1009 19:46:50.408252 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8h99z" event={"ID":"ba292ca5-579c-4a89-b291-53bd3ef8d744","Type":"ContainerStarted","Data":"d740fb03ad983bd5492d0bdc1489013183df2e1232605cb012a7127005f6087a"} Oct 09 19:46:50 crc kubenswrapper[4907]: I1009 19:46:50.437405 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-8h99z" podStartSLOduration=3.37467181 podStartE2EDuration="39.43738923s" podCreationTimestamp="2025-10-09 19:46:11 +0000 
UTC" firstStartedPulling="2025-10-09 19:46:12.693899351 +0000 UTC m=+1058.225866840" lastFinishedPulling="2025-10-09 19:46:48.756616771 +0000 UTC m=+1094.288584260" observedRunningTime="2025-10-09 19:46:50.4329058 +0000 UTC m=+1095.964873299" watchObservedRunningTime="2025-10-09 19:46:50.43738923 +0000 UTC m=+1095.969356719" Oct 09 19:46:52 crc kubenswrapper[4907]: I1009 19:46:52.928089 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:52 crc kubenswrapper[4907]: I1009 19:46:52.928722 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:46:52 crc kubenswrapper[4907]: I1009 19:46:52.991783 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dw45f"] Oct 09 19:46:52 crc kubenswrapper[4907]: I1009 19:46:52.993128 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" podUID="1fa02f8d-3656-4f83-8e33-01d053471999" containerName="dnsmasq-dns" containerID="cri-o://be07f0262303804bfbc1cb8ea506aee01e615e12e1b136f278186de95fee87fa" gracePeriod=10 Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.436216 4907 generic.go:334] "Generic (PLEG): container finished" podID="1fa02f8d-3656-4f83-8e33-01d053471999" containerID="be07f0262303804bfbc1cb8ea506aee01e615e12e1b136f278186de95fee87fa" exitCode=0 Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.436283 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" event={"ID":"1fa02f8d-3656-4f83-8e33-01d053471999","Type":"ContainerDied","Data":"be07f0262303804bfbc1cb8ea506aee01e615e12e1b136f278186de95fee87fa"} Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.436590 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" 
event={"ID":"1fa02f8d-3656-4f83-8e33-01d053471999","Type":"ContainerDied","Data":"4203a06969a31053ae160e096982493c7f647c71bddb9a683ca84b923dc27c33"} Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.436603 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4203a06969a31053ae160e096982493c7f647c71bddb9a683ca84b923dc27c33" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.494864 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.545165 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-svc\") pod \"1fa02f8d-3656-4f83-8e33-01d053471999\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.545271 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w7t9\" (UniqueName: \"kubernetes.io/projected/1fa02f8d-3656-4f83-8e33-01d053471999-kube-api-access-9w7t9\") pod \"1fa02f8d-3656-4f83-8e33-01d053471999\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.545316 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-sb\") pod \"1fa02f8d-3656-4f83-8e33-01d053471999\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.545375 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-swift-storage-0\") pod \"1fa02f8d-3656-4f83-8e33-01d053471999\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " Oct 09 19:46:53 
crc kubenswrapper[4907]: I1009 19:46:53.545414 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-config\") pod \"1fa02f8d-3656-4f83-8e33-01d053471999\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.545452 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-nb\") pod \"1fa02f8d-3656-4f83-8e33-01d053471999\" (UID: \"1fa02f8d-3656-4f83-8e33-01d053471999\") " Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.570675 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa02f8d-3656-4f83-8e33-01d053471999-kube-api-access-9w7t9" (OuterVolumeSpecName: "kube-api-access-9w7t9") pod "1fa02f8d-3656-4f83-8e33-01d053471999" (UID: "1fa02f8d-3656-4f83-8e33-01d053471999"). InnerVolumeSpecName "kube-api-access-9w7t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.600260 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fa02f8d-3656-4f83-8e33-01d053471999" (UID: "1fa02f8d-3656-4f83-8e33-01d053471999"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.604924 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fa02f8d-3656-4f83-8e33-01d053471999" (UID: "1fa02f8d-3656-4f83-8e33-01d053471999"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.616345 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-config" (OuterVolumeSpecName: "config") pod "1fa02f8d-3656-4f83-8e33-01d053471999" (UID: "1fa02f8d-3656-4f83-8e33-01d053471999"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.628283 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fa02f8d-3656-4f83-8e33-01d053471999" (UID: "1fa02f8d-3656-4f83-8e33-01d053471999"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.632676 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fa02f8d-3656-4f83-8e33-01d053471999" (UID: "1fa02f8d-3656-4f83-8e33-01d053471999"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.652544 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.652586 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.652598 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.652607 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.652618 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w7t9\" (UniqueName: \"kubernetes.io/projected/1fa02f8d-3656-4f83-8e33-01d053471999-kube-api-access-9w7t9\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:53 crc kubenswrapper[4907]: I1009 19:46:53.652630 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa02f8d-3656-4f83-8e33-01d053471999-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:54 crc kubenswrapper[4907]: I1009 19:46:54.449893 4907 generic.go:334] "Generic (PLEG): container finished" podID="ba292ca5-579c-4a89-b291-53bd3ef8d744" containerID="d740fb03ad983bd5492d0bdc1489013183df2e1232605cb012a7127005f6087a" exitCode=0 Oct 09 19:46:54 crc kubenswrapper[4907]: I1009 19:46:54.449980 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8h99z" event={"ID":"ba292ca5-579c-4a89-b291-53bd3ef8d744","Type":"ContainerDied","Data":"d740fb03ad983bd5492d0bdc1489013183df2e1232605cb012a7127005f6087a"} Oct 09 19:46:54 crc kubenswrapper[4907]: I1009 19:46:54.450226 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dw45f" Oct 09 19:46:54 crc kubenswrapper[4907]: I1009 19:46:54.498451 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dw45f"] Oct 09 19:46:54 crc kubenswrapper[4907]: I1009 19:46:54.517895 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dw45f"] Oct 09 19:46:54 crc kubenswrapper[4907]: I1009 19:46:54.614170 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:54 crc kubenswrapper[4907]: I1009 19:46:54.682858 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.170660 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa02f8d-3656-4f83-8e33-01d053471999" path="/var/lib/kubelet/pods/1fa02f8d-3656-4f83-8e33-01d053471999/volumes" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.861874 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.909290 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-combined-ca-bundle\") pod \"ba292ca5-579c-4a89-b291-53bd3ef8d744\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.909347 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-scripts\") pod \"ba292ca5-579c-4a89-b291-53bd3ef8d744\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.909416 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-config-data\") pod \"ba292ca5-579c-4a89-b291-53bd3ef8d744\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.909532 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppp76\" (UniqueName: \"kubernetes.io/projected/ba292ca5-579c-4a89-b291-53bd3ef8d744-kube-api-access-ppp76\") pod \"ba292ca5-579c-4a89-b291-53bd3ef8d744\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.909560 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba292ca5-579c-4a89-b291-53bd3ef8d744-etc-machine-id\") pod \"ba292ca5-579c-4a89-b291-53bd3ef8d744\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.909615 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-db-sync-config-data\") pod \"ba292ca5-579c-4a89-b291-53bd3ef8d744\" (UID: \"ba292ca5-579c-4a89-b291-53bd3ef8d744\") " Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.910639 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba292ca5-579c-4a89-b291-53bd3ef8d744-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ba292ca5-579c-4a89-b291-53bd3ef8d744" (UID: "ba292ca5-579c-4a89-b291-53bd3ef8d744"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.916451 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba292ca5-579c-4a89-b291-53bd3ef8d744-kube-api-access-ppp76" (OuterVolumeSpecName: "kube-api-access-ppp76") pod "ba292ca5-579c-4a89-b291-53bd3ef8d744" (UID: "ba292ca5-579c-4a89-b291-53bd3ef8d744"). InnerVolumeSpecName "kube-api-access-ppp76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.916866 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ba292ca5-579c-4a89-b291-53bd3ef8d744" (UID: "ba292ca5-579c-4a89-b291-53bd3ef8d744"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.917453 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-scripts" (OuterVolumeSpecName: "scripts") pod "ba292ca5-579c-4a89-b291-53bd3ef8d744" (UID: "ba292ca5-579c-4a89-b291-53bd3ef8d744"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.950667 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba292ca5-579c-4a89-b291-53bd3ef8d744" (UID: "ba292ca5-579c-4a89-b291-53bd3ef8d744"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:55 crc kubenswrapper[4907]: I1009 19:46:55.963913 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-config-data" (OuterVolumeSpecName: "config-data") pod "ba292ca5-579c-4a89-b291-53bd3ef8d744" (UID: "ba292ca5-579c-4a89-b291-53bd3ef8d744"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.012140 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.012170 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.012179 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.012188 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppp76\" (UniqueName: \"kubernetes.io/projected/ba292ca5-579c-4a89-b291-53bd3ef8d744-kube-api-access-ppp76\") on node \"crc\" DevicePath \"\"" Oct 09 
19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.012198 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba292ca5-579c-4a89-b291-53bd3ef8d744-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.012206 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba292ca5-579c-4a89-b291-53bd3ef8d744-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.196122 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-799f6b8dfc-hssd9" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.476983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8h99z" event={"ID":"ba292ca5-579c-4a89-b291-53bd3ef8d744","Type":"ContainerDied","Data":"07c9bc34105f59accf2e352ed76bf26090bdcd26d3ef763df6cc85a0e33574dd"} Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.477033 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07c9bc34105f59accf2e352ed76bf26090bdcd26d3ef763df6cc85a0e33574dd" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.477037 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8h99z" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790130 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:46:56 crc kubenswrapper[4907]: E1009 19:46:56.790500 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerName="dnsmasq-dns" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790516 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerName="dnsmasq-dns" Oct 09 19:46:56 crc kubenswrapper[4907]: E1009 19:46:56.790532 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba292ca5-579c-4a89-b291-53bd3ef8d744" containerName="cinder-db-sync" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790538 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba292ca5-579c-4a89-b291-53bd3ef8d744" containerName="cinder-db-sync" Oct 09 19:46:56 crc kubenswrapper[4907]: E1009 19:46:56.790552 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerName="init" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790558 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerName="init" Oct 09 19:46:56 crc kubenswrapper[4907]: E1009 19:46:56.790571 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa02f8d-3656-4f83-8e33-01d053471999" containerName="init" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790578 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa02f8d-3656-4f83-8e33-01d053471999" containerName="init" Oct 09 19:46:56 crc kubenswrapper[4907]: E1009 19:46:56.790591 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa02f8d-3656-4f83-8e33-01d053471999" containerName="dnsmasq-dns" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 
19:46:56.790599 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa02f8d-3656-4f83-8e33-01d053471999" containerName="dnsmasq-dns" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790828 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa02f8d-3656-4f83-8e33-01d053471999" containerName="dnsmasq-dns" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790846 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="988dc8ef-dc2f-447c-8689-7a2e500eb773" containerName="dnsmasq-dns" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.790859 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba292ca5-579c-4a89-b291-53bd3ef8d744" containerName="cinder-db-sync" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.791846 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.799533 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.799813 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lj7xz" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.799961 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.800080 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.823023 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.827765 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9dd\" (UniqueName: 
\"kubernetes.io/projected/adb245d3-95d6-4701-b66d-549ae443b0be-kube-api-access-nl9dd\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.827821 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.827940 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-scripts\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.827981 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adb245d3-95d6-4701-b66d-549ae443b0be-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.828015 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.828065 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.865885 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78t58"] Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.868017 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.883739 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78t58"] Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931403 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931535 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-scripts\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931564 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adb245d3-95d6-4701-b66d-549ae443b0be-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931606 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adb245d3-95d6-4701-b66d-549ae443b0be-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931695 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931788 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931829 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931853 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-svc\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931879 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931940 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9dd\" (UniqueName: \"kubernetes.io/projected/adb245d3-95d6-4701-b66d-549ae443b0be-kube-api-access-nl9dd\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931969 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.931998 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25csk\" (UniqueName: \"kubernetes.io/projected/a84c9813-0bcd-4c28-aa91-219dec410336-kube-api-access-25csk\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.938022 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-scripts\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.938081 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.941486 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.946120 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:56 crc kubenswrapper[4907]: I1009 19:46:56.971496 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl9dd\" (UniqueName: \"kubernetes.io/projected/adb245d3-95d6-4701-b66d-549ae443b0be-kube-api-access-nl9dd\") pod \"cinder-scheduler-0\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " pod="openstack/cinder-scheduler-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.034156 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " 
pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.034293 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.034319 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-svc\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.034341 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.034391 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25csk\" (UniqueName: \"kubernetes.io/projected/a84c9813-0bcd-4c28-aa91-219dec410336-kube-api-access-25csk\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.035179 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc 
kubenswrapper[4907]: I1009 19:46:57.035177 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.035235 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.035258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-svc\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.035307 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.035851 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.079189 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-25csk\" (UniqueName: \"kubernetes.io/projected/a84c9813-0bcd-4c28-aa91-219dec410336-kube-api-access-25csk\") pod \"dnsmasq-dns-5784cf869f-78t58\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") " pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.094366 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.096209 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.098587 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.104223 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.121274 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.136295 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-scripts\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.136637 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.136662 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c68d231-8125-4c47-adf2-66344ef91470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.136688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c68d231-8125-4c47-adf2-66344ef91470-logs\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.136749 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2gn\" (UniqueName: \"kubernetes.io/projected/1c68d231-8125-4c47-adf2-66344ef91470-kube-api-access-6j2gn\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.136823 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.136844 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.221680 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.240407 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2gn\" (UniqueName: \"kubernetes.io/projected/1c68d231-8125-4c47-adf2-66344ef91470-kube-api-access-6j2gn\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.241747 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.241795 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 
19:46:57.241983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-scripts\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.242021 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.242059 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c68d231-8125-4c47-adf2-66344ef91470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.242123 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c68d231-8125-4c47-adf2-66344ef91470-logs\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.243053 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c68d231-8125-4c47-adf2-66344ef91470-logs\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.244828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c68d231-8125-4c47-adf2-66344ef91470-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " 
pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.249377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.252957 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data-custom\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.255870 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-scripts\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.262836 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.275023 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2gn\" (UniqueName: \"kubernetes.io/projected/1c68d231-8125-4c47-adf2-66344ef91470-kube-api-access-6j2gn\") pod \"cinder-api-0\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.295240 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:57 crc kubenswrapper[4907]: 
I1009 19:46:57.489517 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbc76cff8-xmlsw" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.544775 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.549219 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-575bf6fcb-29gs7"] Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.549493 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-575bf6fcb-29gs7" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api-log" containerID="cri-o://b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377" gracePeriod=30 Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.549960 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-575bf6fcb-29gs7" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api" containerID="cri-o://cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353" gracePeriod=30 Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.756115 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:46:57 crc kubenswrapper[4907]: W1009 19:46:57.759687 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadb245d3_95d6_4701_b66d_549ae443b0be.slice/crio-6c30c0da22d485590921fffb33f36b20443d3544fd841f05c4ea9d42dcbba1b9 WatchSource:0}: Error finding container 6c30c0da22d485590921fffb33f36b20443d3544fd841f05c4ea9d42dcbba1b9: Status 404 returned error can't find the container with id 6c30c0da22d485590921fffb33f36b20443d3544fd841f05c4ea9d42dcbba1b9 Oct 09 19:46:57 crc kubenswrapper[4907]: I1009 19:46:57.834738 4907 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78t58"] Oct 09 19:46:57 crc kubenswrapper[4907]: W1009 19:46:57.840770 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda84c9813_0bcd_4c28_aa91_219dec410336.slice/crio-f80f3b94efe5c17942c7c0332bc9a04b89df9172486489b5479a2b729e04240e WatchSource:0}: Error finding container f80f3b94efe5c17942c7c0332bc9a04b89df9172486489b5479a2b729e04240e: Status 404 returned error can't find the container with id f80f3b94efe5c17942c7c0332bc9a04b89df9172486489b5479a2b729e04240e Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.080846 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.507522 4907 generic.go:334] "Generic (PLEG): container finished" podID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerID="b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377" exitCode=143 Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.507589 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575bf6fcb-29gs7" event={"ID":"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87","Type":"ContainerDied","Data":"b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377"} Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.509704 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"adb245d3-95d6-4701-b66d-549ae443b0be","Type":"ContainerStarted","Data":"6c30c0da22d485590921fffb33f36b20443d3544fd841f05c4ea9d42dcbba1b9"} Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.511420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c68d231-8125-4c47-adf2-66344ef91470","Type":"ContainerStarted","Data":"17a613d8478e3431a00665e23fa702f25794a1c7386435d127a24b7dfdf1c8e0"} Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.513586 4907 
generic.go:334] "Generic (PLEG): container finished" podID="a84c9813-0bcd-4c28-aa91-219dec410336" containerID="bf18b249579a836b1b5c65c2dd0ced4c4d667f0d408d962d9de8b0c46771bd08" exitCode=0 Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.513624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78t58" event={"ID":"a84c9813-0bcd-4c28-aa91-219dec410336","Type":"ContainerDied","Data":"bf18b249579a836b1b5c65c2dd0ced4c4d667f0d408d962d9de8b0c46771bd08"} Oct 09 19:46:58 crc kubenswrapper[4907]: I1009 19:46:58.513647 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78t58" event={"ID":"a84c9813-0bcd-4c28-aa91-219dec410336","Type":"ContainerStarted","Data":"f80f3b94efe5c17942c7c0332bc9a04b89df9172486489b5479a2b729e04240e"} Oct 09 19:46:59 crc kubenswrapper[4907]: I1009 19:46:59.451052 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:46:59 crc kubenswrapper[4907]: I1009 19:46:59.527715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"adb245d3-95d6-4701-b66d-549ae443b0be","Type":"ContainerStarted","Data":"3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152"} Oct 09 19:46:59 crc kubenswrapper[4907]: I1009 19:46:59.529457 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c68d231-8125-4c47-adf2-66344ef91470","Type":"ContainerStarted","Data":"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f"} Oct 09 19:46:59 crc kubenswrapper[4907]: I1009 19:46:59.540030 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78t58" event={"ID":"a84c9813-0bcd-4c28-aa91-219dec410336","Type":"ContainerStarted","Data":"fddbf440ebdeefbe0a53dcebd417d27aeb9d01507c8789be83fbde014e053576"} Oct 09 19:46:59 crc kubenswrapper[4907]: I1009 19:46:59.541696 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:46:59 crc kubenswrapper[4907]: I1009 19:46:59.570571 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-78t58" podStartSLOduration=3.570549245 podStartE2EDuration="3.570549245s" podCreationTimestamp="2025-10-09 19:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:46:59.562546179 +0000 UTC m=+1105.094513688" watchObservedRunningTime="2025-10-09 19:46:59.570549245 +0000 UTC m=+1105.102516734" Oct 09 19:46:59 crc kubenswrapper[4907]: I1009 19:46:59.960116 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.439834 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.441546 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.444546 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.444748 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.444975 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p6sdh" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.450055 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.513909 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.515071 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.515266 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lr7d\" (UniqueName: \"kubernetes.io/projected/0f97d607-4cf4-4c31-85eb-462554b18b34-kube-api-access-4lr7d\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.515425 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.571600 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"adb245d3-95d6-4701-b66d-549ae443b0be","Type":"ContainerStarted","Data":"ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165"} Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.576607 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c68d231-8125-4c47-adf2-66344ef91470","Type":"ContainerStarted","Data":"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694"} Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.576864 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1c68d231-8125-4c47-adf2-66344ef91470" containerName="cinder-api-log" containerID="cri-o://980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f" gracePeriod=30 Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.576965 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.577003 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1c68d231-8125-4c47-adf2-66344ef91470" containerName="cinder-api" containerID="cri-o://749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694" gracePeriod=30 Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.615281 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.595426935 
podStartE2EDuration="4.615261969s" podCreationTimestamp="2025-10-09 19:46:56 +0000 UTC" firstStartedPulling="2025-10-09 19:46:57.771682066 +0000 UTC m=+1103.303649555" lastFinishedPulling="2025-10-09 19:46:58.7915171 +0000 UTC m=+1104.323484589" observedRunningTime="2025-10-09 19:47:00.609294713 +0000 UTC m=+1106.141262232" watchObservedRunningTime="2025-10-09 19:47:00.615261969 +0000 UTC m=+1106.147229458" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.619012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lr7d\" (UniqueName: \"kubernetes.io/projected/0f97d607-4cf4-4c31-85eb-462554b18b34-kube-api-access-4lr7d\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.619370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.619591 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.619724 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.620959 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.628553 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.629330 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.629310964 podStartE2EDuration="3.629310964s" podCreationTimestamp="2025-10-09 19:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:47:00.627767076 +0000 UTC m=+1106.159734575" watchObservedRunningTime="2025-10-09 19:47:00.629310964 +0000 UTC m=+1106.161278453" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.634723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.652243 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lr7d\" (UniqueName: \"kubernetes.io/projected/0f97d607-4cf4-4c31-85eb-462554b18b34-kube-api-access-4lr7d\") pod \"openstackclient\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.761076 4907 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.767904 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-575bf6fcb-29gs7" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:42110->10.217.0.165:9311: read: connection reset by peer" Oct 09 19:47:00 crc kubenswrapper[4907]: I1009 19:47:00.767929 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-575bf6fcb-29gs7" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:42122->10.217.0.165:9311: read: connection reset by peer" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.207679 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.232477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c68d231-8125-4c47-adf2-66344ef91470-etc-machine-id\") pod \"1c68d231-8125-4c47-adf2-66344ef91470\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.232545 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-scripts\") pod \"1c68d231-8125-4c47-adf2-66344ef91470\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.232590 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data\") pod 
\"1c68d231-8125-4c47-adf2-66344ef91470\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.232617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data-custom\") pod \"1c68d231-8125-4c47-adf2-66344ef91470\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.232843 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j2gn\" (UniqueName: \"kubernetes.io/projected/1c68d231-8125-4c47-adf2-66344ef91470-kube-api-access-6j2gn\") pod \"1c68d231-8125-4c47-adf2-66344ef91470\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.232885 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c68d231-8125-4c47-adf2-66344ef91470-logs\") pod \"1c68d231-8125-4c47-adf2-66344ef91470\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.233926 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-combined-ca-bundle\") pod \"1c68d231-8125-4c47-adf2-66344ef91470\" (UID: \"1c68d231-8125-4c47-adf2-66344ef91470\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.233011 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c68d231-8125-4c47-adf2-66344ef91470-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c68d231-8125-4c47-adf2-66344ef91470" (UID: "1c68d231-8125-4c47-adf2-66344ef91470"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.233698 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c68d231-8125-4c47-adf2-66344ef91470-logs" (OuterVolumeSpecName: "logs") pod "1c68d231-8125-4c47-adf2-66344ef91470" (UID: "1c68d231-8125-4c47-adf2-66344ef91470"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.238651 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c68d231-8125-4c47-adf2-66344ef91470-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.238671 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c68d231-8125-4c47-adf2-66344ef91470-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.239679 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c68d231-8125-4c47-adf2-66344ef91470" (UID: "1c68d231-8125-4c47-adf2-66344ef91470"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.246782 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c68d231-8125-4c47-adf2-66344ef91470-kube-api-access-6j2gn" (OuterVolumeSpecName: "kube-api-access-6j2gn") pod "1c68d231-8125-4c47-adf2-66344ef91470" (UID: "1c68d231-8125-4c47-adf2-66344ef91470"). InnerVolumeSpecName "kube-api-access-6j2gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.250798 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-scripts" (OuterVolumeSpecName: "scripts") pod "1c68d231-8125-4c47-adf2-66344ef91470" (UID: "1c68d231-8125-4c47-adf2-66344ef91470"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.269078 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c68d231-8125-4c47-adf2-66344ef91470" (UID: "1c68d231-8125-4c47-adf2-66344ef91470"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.276505 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.339717 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-combined-ca-bundle\") pod \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.339846 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-logs\") pod \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.339960 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data\") pod \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.340029 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldz4m\" (UniqueName: \"kubernetes.io/projected/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-kube-api-access-ldz4m\") pod \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.340057 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data-custom\") pod \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\" (UID: \"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87\") " Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.340411 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j2gn\" (UniqueName: 
\"kubernetes.io/projected/1c68d231-8125-4c47-adf2-66344ef91470-kube-api-access-6j2gn\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.340427 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.340436 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.340444 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.340864 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data" (OuterVolumeSpecName: "config-data") pod "1c68d231-8125-4c47-adf2-66344ef91470" (UID: "1c68d231-8125-4c47-adf2-66344ef91470"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.341145 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-logs" (OuterVolumeSpecName: "logs") pod "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" (UID: "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.344823 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" (UID: "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.347877 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-kube-api-access-ldz4m" (OuterVolumeSpecName: "kube-api-access-ldz4m") pod "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" (UID: "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87"). InnerVolumeSpecName "kube-api-access-ldz4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.370332 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" (UID: "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.406532 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data" (OuterVolumeSpecName: "config-data") pod "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" (UID: "8c8c3c56-1fea-44e4-b03f-0d54ac61ab87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.411619 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 19:47:01 crc kubenswrapper[4907]: W1009 19:47:01.417707 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f97d607_4cf4_4c31_85eb_462554b18b34.slice/crio-ce7c11d930678232e200a23f958ed98249acd1059f370a85416533e4e71af0bf WatchSource:0}: Error finding container ce7c11d930678232e200a23f958ed98249acd1059f370a85416533e4e71af0bf: Status 404 returned error can't find the container with id ce7c11d930678232e200a23f958ed98249acd1059f370a85416533e4e71af0bf Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.442568 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.442607 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldz4m\" (UniqueName: \"kubernetes.io/projected/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-kube-api-access-ldz4m\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.442621 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.442632 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.442642 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.442654 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c68d231-8125-4c47-adf2-66344ef91470-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.586867 4907 generic.go:334] "Generic (PLEG): container finished" podID="1c68d231-8125-4c47-adf2-66344ef91470" containerID="749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694" exitCode=0 Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.587201 4907 generic.go:334] "Generic (PLEG): container finished" podID="1c68d231-8125-4c47-adf2-66344ef91470" containerID="980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f" exitCode=143 Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.587000 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c68d231-8125-4c47-adf2-66344ef91470","Type":"ContainerDied","Data":"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694"} Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.587294 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c68d231-8125-4c47-adf2-66344ef91470","Type":"ContainerDied","Data":"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f"} Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.587313 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1c68d231-8125-4c47-adf2-66344ef91470","Type":"ContainerDied","Data":"17a613d8478e3431a00665e23fa702f25794a1c7386435d127a24b7dfdf1c8e0"} Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.587331 4907 scope.go:117] "RemoveContainer" containerID="749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 
19:47:01.587073 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.589285 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0f97d607-4cf4-4c31-85eb-462554b18b34","Type":"ContainerStarted","Data":"ce7c11d930678232e200a23f958ed98249acd1059f370a85416533e4e71af0bf"} Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.594847 4907 generic.go:334] "Generic (PLEG): container finished" podID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerID="cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353" exitCode=0 Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.595103 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575bf6fcb-29gs7" event={"ID":"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87","Type":"ContainerDied","Data":"cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353"} Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.595155 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575bf6fcb-29gs7" event={"ID":"8c8c3c56-1fea-44e4-b03f-0d54ac61ab87","Type":"ContainerDied","Data":"9bb70c6a6ce20661378210ed0307ee1533bf54163482107aef2b0645ed5cba9b"} Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.595254 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575bf6fcb-29gs7" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.629119 4907 scope.go:117] "RemoveContainer" containerID="980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.639770 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-575bf6fcb-29gs7"] Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.660195 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-575bf6fcb-29gs7"] Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.672187 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.685162 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.695999 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.696413 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api-log" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696425 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api-log" Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.696437 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696443 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api" Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.696477 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c68d231-8125-4c47-adf2-66344ef91470" 
containerName="cinder-api" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696485 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c68d231-8125-4c47-adf2-66344ef91470" containerName="cinder-api" Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.696502 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c68d231-8125-4c47-adf2-66344ef91470" containerName="cinder-api-log" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696508 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c68d231-8125-4c47-adf2-66344ef91470" containerName="cinder-api-log" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696729 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api-log" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696748 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c68d231-8125-4c47-adf2-66344ef91470" containerName="cinder-api-log" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696757 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c68d231-8125-4c47-adf2-66344ef91470" containerName="cinder-api" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.696764 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" containerName="barbican-api" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.697707 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.701433 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.701788 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.701987 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.703510 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.735394 4907 scope.go:117] "RemoveContainer" containerID="749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694" Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.735852 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694\": container with ID starting with 749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694 not found: ID does not exist" containerID="749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.735905 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694"} err="failed to get container status \"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694\": rpc error: code = NotFound desc = could not find container \"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694\": container with ID starting with 749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694 not found: ID does not exist" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 
19:47:01.735924 4907 scope.go:117] "RemoveContainer" containerID="980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f" Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.736321 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f\": container with ID starting with 980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f not found: ID does not exist" containerID="980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.736359 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f"} err="failed to get container status \"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f\": rpc error: code = NotFound desc = could not find container \"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f\": container with ID starting with 980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f not found: ID does not exist" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.736385 4907 scope.go:117] "RemoveContainer" containerID="749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.736843 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694"} err="failed to get container status \"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694\": rpc error: code = NotFound desc = could not find container \"749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694\": container with ID starting with 749a62c16a32c4641556f2e448be7f6b63697692bc46e92afe4a807cd79ed694 not found: ID does not exist" Oct 09 19:47:01 crc 
kubenswrapper[4907]: I1009 19:47:01.736956 4907 scope.go:117] "RemoveContainer" containerID="980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.737228 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f"} err="failed to get container status \"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f\": rpc error: code = NotFound desc = could not find container \"980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f\": container with ID starting with 980ddd70328d13e2a12b87770a5b62d8152b0f680c58719f2f2f5193d542f92f not found: ID does not exist" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.737248 4907 scope.go:117] "RemoveContainer" containerID="cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.748674 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-config-data-custom\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.748833 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.748930 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-logs\") pod \"cinder-api-0\" (UID: 
\"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.748955 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.749120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l8md\" (UniqueName: \"kubernetes.io/projected/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-kube-api-access-5l8md\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.749188 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-config-data\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.749267 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-scripts\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.749301 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-public-tls-certs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 
19:47:01.749317 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-etc-machine-id\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.756793 4907 scope.go:117] "RemoveContainer" containerID="b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.777818 4907 scope.go:117] "RemoveContainer" containerID="cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353" Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.778211 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353\": container with ID starting with cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353 not found: ID does not exist" containerID="cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.778245 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353"} err="failed to get container status \"cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353\": rpc error: code = NotFound desc = could not find container \"cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353\": container with ID starting with cadfe0970b91c24628434d1988ec0a5aba6d39919e4286622d507cba085a4353 not found: ID does not exist" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.778274 4907 scope.go:117] "RemoveContainer" containerID="b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377" Oct 09 19:47:01 crc kubenswrapper[4907]: E1009 19:47:01.778948 4907 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377\": container with ID starting with b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377 not found: ID does not exist" containerID="b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.778975 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377"} err="failed to get container status \"b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377\": rpc error: code = NotFound desc = could not find container \"b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377\": container with ID starting with b2fd1858937b10ca26e0b18954a2cd342688eff468661edfdae76bdca3fe4377 not found: ID does not exist" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.850980 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-logs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851040 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l8md\" (UniqueName: \"kubernetes.io/projected/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-kube-api-access-5l8md\") pod \"cinder-api-0\" (UID: 
\"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851184 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-config-data\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-scripts\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851271 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-public-tls-certs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851298 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-etc-machine-id\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851350 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-config-data-custom\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851413 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-logs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.851525 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-etc-machine-id\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.856568 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-public-tls-certs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.857798 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.858301 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-config-data\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 
19:47:01.858708 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-scripts\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.860005 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-config-data-custom\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.870193 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:01 crc kubenswrapper[4907]: I1009 19:47:01.872452 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l8md\" (UniqueName: \"kubernetes.io/projected/727d6b58-cf41-4fe0-bf13-5a5a82fe2747-kube-api-access-5l8md\") pod \"cinder-api-0\" (UID: \"727d6b58-cf41-4fe0-bf13-5a5a82fe2747\") " pod="openstack/cinder-api-0" Oct 09 19:47:02 crc kubenswrapper[4907]: I1009 19:47:02.030749 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 19:47:02 crc kubenswrapper[4907]: I1009 19:47:02.122114 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 19:47:02 crc kubenswrapper[4907]: I1009 19:47:02.507268 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 19:47:02 crc kubenswrapper[4907]: W1009 19:47:02.525483 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod727d6b58_cf41_4fe0_bf13_5a5a82fe2747.slice/crio-ca2ba88cb6cf6739aabf13eddad2c60925e48eebfad975b6be6ccae8fa48d8fc WatchSource:0}: Error finding container ca2ba88cb6cf6739aabf13eddad2c60925e48eebfad975b6be6ccae8fa48d8fc: Status 404 returned error can't find the container with id ca2ba88cb6cf6739aabf13eddad2c60925e48eebfad975b6be6ccae8fa48d8fc Oct 09 19:47:02 crc kubenswrapper[4907]: I1009 19:47:02.615906 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"727d6b58-cf41-4fe0-bf13-5a5a82fe2747","Type":"ContainerStarted","Data":"ca2ba88cb6cf6739aabf13eddad2c60925e48eebfad975b6be6ccae8fa48d8fc"} Oct 09 19:47:03 crc kubenswrapper[4907]: I1009 19:47:03.164710 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c68d231-8125-4c47-adf2-66344ef91470" path="/var/lib/kubelet/pods/1c68d231-8125-4c47-adf2-66344ef91470/volumes" Oct 09 19:47:03 crc kubenswrapper[4907]: I1009 19:47:03.166004 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8c3c56-1fea-44e4-b03f-0d54ac61ab87" path="/var/lib/kubelet/pods/8c8c3c56-1fea-44e4-b03f-0d54ac61ab87/volumes" Oct 09 19:47:03 crc kubenswrapper[4907]: I1009 19:47:03.630350 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"727d6b58-cf41-4fe0-bf13-5a5a82fe2747","Type":"ContainerStarted","Data":"131e33e6374f50a8add23b5b85e78a2fd250c03c1c543ba39f321a5aa35483a1"} Oct 09 19:47:04 crc kubenswrapper[4907]: I1009 19:47:04.650792 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"727d6b58-cf41-4fe0-bf13-5a5a82fe2747","Type":"ContainerStarted","Data":"5b80b3fd6def4bce7bf879d3707caf2452c071e871b5944d6a64f079efbef801"} Oct 09 19:47:04 crc kubenswrapper[4907]: I1009 19:47:04.651224 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 19:47:04 crc kubenswrapper[4907]: I1009 19:47:04.677914 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.677895052 podStartE2EDuration="3.677895052s" podCreationTimestamp="2025-10-09 19:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:47:04.674186841 +0000 UTC m=+1110.206154340" watchObservedRunningTime="2025-10-09 19:47:04.677895052 +0000 UTC m=+1110.209862541" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.548617 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-78d77d8c99-ljb4q"] Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.550087 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.551820 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.559835 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.560235 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.590876 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78d77d8c99-ljb4q"] Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.642485 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-config-data\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.642548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-etc-swift\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.642620 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-public-tls-certs\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 
19:47:05.642653 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-internal-tls-certs\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.642670 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2r9d\" (UniqueName: \"kubernetes.io/projected/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-kube-api-access-x2r9d\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.642686 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-combined-ca-bundle\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.642723 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-run-httpd\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.642756 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-log-httpd\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" 
Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.743781 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-internal-tls-certs\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.743814 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2r9d\" (UniqueName: \"kubernetes.io/projected/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-kube-api-access-x2r9d\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.743837 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-combined-ca-bundle\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.743878 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-run-httpd\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.743924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-log-httpd\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 
19:47:05.743969 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-config-data\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.744015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-etc-swift\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.744105 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-public-tls-certs\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.745760 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-run-httpd\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.747197 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-log-httpd\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.749872 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-internal-tls-certs\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.750346 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-config-data\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.758134 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-combined-ca-bundle\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.759297 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-public-tls-certs\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.759308 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-etc-swift\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.760865 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2r9d\" (UniqueName: 
\"kubernetes.io/projected/6b2c2269-5cb3-4bf8-a162-e6a11531eca4-kube-api-access-x2r9d\") pod \"swift-proxy-78d77d8c99-ljb4q\" (UID: \"6b2c2269-5cb3-4bf8-a162-e6a11531eca4\") " pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:05 crc kubenswrapper[4907]: I1009 19:47:05.887421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:06 crc kubenswrapper[4907]: I1009 19:47:06.299213 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:47:06 crc kubenswrapper[4907]: I1009 19:47:06.299285 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:47:06 crc kubenswrapper[4907]: I1009 19:47:06.688058 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7496575b44-md76p" Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.223679 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-78t58" Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.296201 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-hcwrp"] Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.296602 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" podUID="a3160fd3-8937-418d-846b-47aff379cded" containerName="dnsmasq-dns" 
containerID="cri-o://85558c61f6f1dbef6e741983569b6116b67a31e8447e5a5554718fece782aa80" gracePeriod=10 Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.400093 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.452427 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.679233 4907 generic.go:334] "Generic (PLEG): container finished" podID="a3160fd3-8937-418d-846b-47aff379cded" containerID="85558c61f6f1dbef6e741983569b6116b67a31e8447e5a5554718fece782aa80" exitCode=0 Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.679370 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" event={"ID":"a3160fd3-8937-418d-846b-47aff379cded","Type":"ContainerDied","Data":"85558c61f6f1dbef6e741983569b6116b67a31e8447e5a5554718fece782aa80"} Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.679545 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="cinder-scheduler" containerID="cri-o://3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152" gracePeriod=30 Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.679621 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="probe" containerID="cri-o://ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165" gracePeriod=30 Oct 09 19:47:07 crc kubenswrapper[4907]: I1009 19:47:07.929012 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" podUID="a3160fd3-8937-418d-846b-47aff379cded" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.164:5353: connect: 
connection refused" Oct 09 19:47:08 crc kubenswrapper[4907]: I1009 19:47:08.685934 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68f5dff589-gkl29" Oct 09 19:47:08 crc kubenswrapper[4907]: I1009 19:47:08.692298 4907 generic.go:334] "Generic (PLEG): container finished" podID="adb245d3-95d6-4701-b66d-549ae443b0be" containerID="ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165" exitCode=0 Oct 09 19:47:08 crc kubenswrapper[4907]: I1009 19:47:08.692360 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"adb245d3-95d6-4701-b66d-549ae443b0be","Type":"ContainerDied","Data":"ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165"} Oct 09 19:47:08 crc kubenswrapper[4907]: I1009 19:47:08.746059 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7496575b44-md76p"] Oct 09 19:47:08 crc kubenswrapper[4907]: I1009 19:47:08.746271 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7496575b44-md76p" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-api" containerID="cri-o://80a3619fa77a23367b745d811eeb278943eee9bb68e249c257b3d1ef99d38ed4" gracePeriod=30 Oct 09 19:47:08 crc kubenswrapper[4907]: I1009 19:47:08.746630 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7496575b44-md76p" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-httpd" containerID="cri-o://ee86b2e12031a669857531dfec780c7c87280b227afd9d3cd8fecabbce2e4e68" gracePeriod=30 Oct 09 19:47:09 crc kubenswrapper[4907]: I1009 19:47:09.704293 4907 generic.go:334] "Generic (PLEG): container finished" podID="16b9987f-f46c-4f23-851d-152c49a34fea" containerID="ee86b2e12031a669857531dfec780c7c87280b227afd9d3cd8fecabbce2e4e68" exitCode=0 Oct 09 19:47:09 crc kubenswrapper[4907]: I1009 19:47:09.704339 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-7496575b44-md76p" event={"ID":"16b9987f-f46c-4f23-851d-152c49a34fea","Type":"ContainerDied","Data":"ee86b2e12031a669857531dfec780c7c87280b227afd9d3cd8fecabbce2e4e68"} Oct 09 19:47:10 crc kubenswrapper[4907]: I1009 19:47:10.716759 4907 generic.go:334] "Generic (PLEG): container finished" podID="16b9987f-f46c-4f23-851d-152c49a34fea" containerID="80a3619fa77a23367b745d811eeb278943eee9bb68e249c257b3d1ef99d38ed4" exitCode=0 Oct 09 19:47:10 crc kubenswrapper[4907]: I1009 19:47:10.717012 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7496575b44-md76p" event={"ID":"16b9987f-f46c-4f23-851d-152c49a34fea","Type":"ContainerDied","Data":"80a3619fa77a23367b745d811eeb278943eee9bb68e249c257b3d1ef99d38ed4"} Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.525876 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.662137 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.679738 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-svc\") pod \"a3160fd3-8937-418d-846b-47aff379cded\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.679835 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4k4\" (UniqueName: \"kubernetes.io/projected/a3160fd3-8937-418d-846b-47aff379cded-kube-api-access-4d4k4\") pod \"a3160fd3-8937-418d-846b-47aff379cded\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.679956 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-nb\") pod \"a3160fd3-8937-418d-846b-47aff379cded\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.679991 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-sb\") pod \"a3160fd3-8937-418d-846b-47aff379cded\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.680041 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-config\") pod \"a3160fd3-8937-418d-846b-47aff379cded\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.680070 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-swift-storage-0\") pod \"a3160fd3-8937-418d-846b-47aff379cded\" (UID: \"a3160fd3-8937-418d-846b-47aff379cded\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.687584 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3160fd3-8937-418d-846b-47aff379cded-kube-api-access-4d4k4" (OuterVolumeSpecName: "kube-api-access-4d4k4") pod "a3160fd3-8937-418d-846b-47aff379cded" (UID: "a3160fd3-8937-418d-846b-47aff379cded"). InnerVolumeSpecName "kube-api-access-4d4k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.739713 4907 generic.go:334] "Generic (PLEG): container finished" podID="adb245d3-95d6-4701-b66d-549ae443b0be" containerID="3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152" exitCode=0 Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.739778 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"adb245d3-95d6-4701-b66d-549ae443b0be","Type":"ContainerDied","Data":"3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152"} Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.739803 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"adb245d3-95d6-4701-b66d-549ae443b0be","Type":"ContainerDied","Data":"6c30c0da22d485590921fffb33f36b20443d3544fd841f05c4ea9d42dcbba1b9"} Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.739819 4907 scope.go:117] "RemoveContainer" containerID="ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.739932 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.745608 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" event={"ID":"a3160fd3-8937-418d-846b-47aff379cded","Type":"ContainerDied","Data":"3849cb78afda476625babf74eed45b6aa425295354a5250446112fdee5309edf"} Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.745662 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-hcwrp" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.748259 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0f97d607-4cf4-4c31-85eb-462554b18b34","Type":"ContainerStarted","Data":"377ede7bbce4a592098c5a202b7c32a4c9639ca9717fcb6cf82c131db21ce35f"} Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.761746 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3160fd3-8937-418d-846b-47aff379cded" (UID: "a3160fd3-8937-418d-846b-47aff379cded"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.762922 4907 scope.go:117] "RemoveContainer" containerID="3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.772693 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.838122112 podStartE2EDuration="11.772671745s" podCreationTimestamp="2025-10-09 19:47:00 +0000 UTC" firstStartedPulling="2025-10-09 19:47:01.41984877 +0000 UTC m=+1106.951816259" lastFinishedPulling="2025-10-09 19:47:11.354398403 +0000 UTC m=+1116.886365892" observedRunningTime="2025-10-09 19:47:11.770192474 +0000 UTC m=+1117.302159953" watchObservedRunningTime="2025-10-09 19:47:11.772671745 +0000 UTC m=+1117.304639234" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.773674 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3160fd3-8937-418d-846b-47aff379cded" (UID: "a3160fd3-8937-418d-846b-47aff379cded"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.778557 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3160fd3-8937-418d-846b-47aff379cded" (UID: "a3160fd3-8937-418d-846b-47aff379cded"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.782806 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data-custom\") pod \"adb245d3-95d6-4701-b66d-549ae443b0be\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.782919 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adb245d3-95d6-4701-b66d-549ae443b0be-etc-machine-id\") pod \"adb245d3-95d6-4701-b66d-549ae443b0be\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.782950 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl9dd\" (UniqueName: \"kubernetes.io/projected/adb245d3-95d6-4701-b66d-549ae443b0be-kube-api-access-nl9dd\") pod \"adb245d3-95d6-4701-b66d-549ae443b0be\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.783125 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-combined-ca-bundle\") pod \"adb245d3-95d6-4701-b66d-549ae443b0be\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.783146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-scripts\") pod \"adb245d3-95d6-4701-b66d-549ae443b0be\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.783173 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data\") pod \"adb245d3-95d6-4701-b66d-549ae443b0be\" (UID: \"adb245d3-95d6-4701-b66d-549ae443b0be\") " Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.783187 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adb245d3-95d6-4701-b66d-549ae443b0be-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "adb245d3-95d6-4701-b66d-549ae443b0be" (UID: "adb245d3-95d6-4701-b66d-549ae443b0be"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.784756 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.784780 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4k4\" (UniqueName: \"kubernetes.io/projected/a3160fd3-8937-418d-846b-47aff379cded-kube-api-access-4d4k4\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.784791 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adb245d3-95d6-4701-b66d-549ae443b0be-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.784800 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.784808 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc 
kubenswrapper[4907]: I1009 19:47:11.787447 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-config" (OuterVolumeSpecName: "config") pod "a3160fd3-8937-418d-846b-47aff379cded" (UID: "a3160fd3-8937-418d-846b-47aff379cded"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.788277 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3160fd3-8937-418d-846b-47aff379cded" (UID: "a3160fd3-8937-418d-846b-47aff379cded"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.789416 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-scripts" (OuterVolumeSpecName: "scripts") pod "adb245d3-95d6-4701-b66d-549ae443b0be" (UID: "adb245d3-95d6-4701-b66d-549ae443b0be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.790487 4907 scope.go:117] "RemoveContainer" containerID="ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.790504 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "adb245d3-95d6-4701-b66d-549ae443b0be" (UID: "adb245d3-95d6-4701-b66d-549ae443b0be"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.790654 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb245d3-95d6-4701-b66d-549ae443b0be-kube-api-access-nl9dd" (OuterVolumeSpecName: "kube-api-access-nl9dd") pod "adb245d3-95d6-4701-b66d-549ae443b0be" (UID: "adb245d3-95d6-4701-b66d-549ae443b0be"). InnerVolumeSpecName "kube-api-access-nl9dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: E1009 19:47:11.791279 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165\": container with ID starting with ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165 not found: ID does not exist" containerID="ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.791318 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165"} err="failed to get container status \"ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165\": rpc error: code = NotFound desc = could not find container \"ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165\": container with ID starting with ab896bdb8a7dfa7eb0f9e8d41919b128e75eedfa58e76d708c4262d30b8d3165 not found: ID does not exist" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.791344 4907 scope.go:117] "RemoveContainer" containerID="3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152" Oct 09 19:47:11 crc kubenswrapper[4907]: E1009 19:47:11.791864 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152\": container with ID starting with 3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152 not found: ID does not exist" containerID="3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.791890 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152"} err="failed to get container status \"3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152\": rpc error: code = NotFound desc = could not find container \"3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152\": container with ID starting with 3d7feea9482043de18b62e10772ef0b48f3b2fb38ec11c07f816e70558dea152 not found: ID does not exist" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.793626 4907 scope.go:117] "RemoveContainer" containerID="85558c61f6f1dbef6e741983569b6116b67a31e8447e5a5554718fece782aa80" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.827778 4907 scope.go:117] "RemoveContainer" containerID="ec5204cd930f394043964447e7f51893e57753364973377ae21df8788796a814" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.867660 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adb245d3-95d6-4701-b66d-549ae443b0be" (UID: "adb245d3-95d6-4701-b66d-549ae443b0be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.886037 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.886070 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.886080 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.886091 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl9dd\" (UniqueName: \"kubernetes.io/projected/adb245d3-95d6-4701-b66d-549ae443b0be-kube-api-access-nl9dd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.886101 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.886109 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3160fd3-8937-418d-846b-47aff379cded-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.887646 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data" (OuterVolumeSpecName: "config-data") pod "adb245d3-95d6-4701-b66d-549ae443b0be" (UID: 
"adb245d3-95d6-4701-b66d-549ae443b0be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.957287 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78d77d8c99-ljb4q"] Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.985180 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7496575b44-md76p" Oct 09 19:47:11 crc kubenswrapper[4907]: I1009 19:47:11.987523 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adb245d3-95d6-4701-b66d-549ae443b0be-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.089389 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sskbl\" (UniqueName: \"kubernetes.io/projected/16b9987f-f46c-4f23-851d-152c49a34fea-kube-api-access-sskbl\") pod \"16b9987f-f46c-4f23-851d-152c49a34fea\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.089512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-httpd-config\") pod \"16b9987f-f46c-4f23-851d-152c49a34fea\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.089554 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-config\") pod \"16b9987f-f46c-4f23-851d-152c49a34fea\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.089613 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-combined-ca-bundle\") pod \"16b9987f-f46c-4f23-851d-152c49a34fea\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.089749 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-ovndb-tls-certs\") pod \"16b9987f-f46c-4f23-851d-152c49a34fea\" (UID: \"16b9987f-f46c-4f23-851d-152c49a34fea\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.096222 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "16b9987f-f46c-4f23-851d-152c49a34fea" (UID: "16b9987f-f46c-4f23-851d-152c49a34fea"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.097371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b9987f-f46c-4f23-851d-152c49a34fea-kube-api-access-sskbl" (OuterVolumeSpecName: "kube-api-access-sskbl") pod "16b9987f-f46c-4f23-851d-152c49a34fea" (UID: "16b9987f-f46c-4f23-851d-152c49a34fea"). InnerVolumeSpecName "kube-api-access-sskbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.100060 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-hcwrp"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.109585 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-hcwrp"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.158564 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.173611 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.193900 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sskbl\" (UniqueName: \"kubernetes.io/projected/16b9987f-f46c-4f23-851d-152c49a34fea-kube-api-access-sskbl\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.193947 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.199211 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.199772 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3160fd3-8937-418d-846b-47aff379cded" containerName="dnsmasq-dns" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.199790 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3160fd3-8937-418d-846b-47aff379cded" containerName="dnsmasq-dns" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.199801 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3160fd3-8937-418d-846b-47aff379cded" containerName="init" Oct 09 19:47:12 crc 
kubenswrapper[4907]: I1009 19:47:12.199807 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3160fd3-8937-418d-846b-47aff379cded" containerName="init" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.199829 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="cinder-scheduler" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.199835 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="cinder-scheduler" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.199844 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="probe" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.199849 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="probe" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.199865 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-api" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.199871 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-api" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.199882 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-httpd" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.199887 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-httpd" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.200046 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-api" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.200061 4907 
memory_manager.go:354] "RemoveStaleState removing state" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="cinder-scheduler" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.200075 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" containerName="neutron-httpd" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.200086 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3160fd3-8937-418d-846b-47aff379cded" containerName="dnsmasq-dns" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.200248 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" containerName="probe" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.201344 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.203936 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.207086 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-config" (OuterVolumeSpecName: "config") pod "16b9987f-f46c-4f23-851d-152c49a34fea" (UID: "16b9987f-f46c-4f23-851d-152c49a34fea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.209174 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16b9987f-f46c-4f23-851d-152c49a34fea" (UID: "16b9987f-f46c-4f23-851d-152c49a34fea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.225664 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.238310 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "16b9987f-f46c-4f23-851d-152c49a34fea" (UID: "16b9987f-f46c-4f23-851d-152c49a34fea"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300081 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc694291-2c4e-4bdf-b00c-4025d2018e96-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300151 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvtx\" (UniqueName: \"kubernetes.io/projected/dc694291-2c4e-4bdf-b00c-4025d2018e96-kube-api-access-grvtx\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300328 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-scripts\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300410 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300456 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-config-data\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300534 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300550 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.300724 4907 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b9987f-f46c-4f23-851d-152c49a34fea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.404069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/dc694291-2c4e-4bdf-b00c-4025d2018e96-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.404161 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.404259 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grvtx\" (UniqueName: \"kubernetes.io/projected/dc694291-2c4e-4bdf-b00c-4025d2018e96-kube-api-access-grvtx\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.404397 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-scripts\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.404480 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.404524 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-config-data\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") 
" pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.405594 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc694291-2c4e-4bdf-b00c-4025d2018e96-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.416604 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-config-data\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.417070 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-scripts\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.417425 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.417835 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc694291-2c4e-4bdf-b00c-4025d2018e96-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.478611 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grvtx\" 
(UniqueName: \"kubernetes.io/projected/dc694291-2c4e-4bdf-b00c-4025d2018e96-kube-api-access-grvtx\") pod \"cinder-scheduler-0\" (UID: \"dc694291-2c4e-4bdf-b00c-4025d2018e96\") " pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.481648 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6tdng"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.483221 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6tdng" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.494064 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6tdng"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.542560 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-r6m6d"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.543603 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r6m6d" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.554539 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r6m6d"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.555035 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.630602 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv2dm\" (UniqueName: \"kubernetes.io/projected/3cf74d09-587e-410e-b450-e4d5206d4f55-kube-api-access-vv2dm\") pod \"3cf74d09-587e-410e-b450-e4d5206d4f55\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.630955 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-sg-core-conf-yaml\") pod \"3cf74d09-587e-410e-b450-e4d5206d4f55\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.630997 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-log-httpd\") pod \"3cf74d09-587e-410e-b450-e4d5206d4f55\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.631026 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-run-httpd\") pod \"3cf74d09-587e-410e-b450-e4d5206d4f55\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.631065 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-config-data\") pod \"3cf74d09-587e-410e-b450-e4d5206d4f55\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.631087 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-scripts\") pod \"3cf74d09-587e-410e-b450-e4d5206d4f55\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.631203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-combined-ca-bundle\") pod \"3cf74d09-587e-410e-b450-e4d5206d4f55\" (UID: \"3cf74d09-587e-410e-b450-e4d5206d4f55\") " Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.631416 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6xf5\" (UniqueName: \"kubernetes.io/projected/0bd23d06-3e11-4a14-bddf-dd7e4782fd45-kube-api-access-m6xf5\") pod \"nova-api-db-create-6tdng\" (UID: \"0bd23d06-3e11-4a14-bddf-dd7e4782fd45\") " pod="openstack/nova-api-db-create-6tdng" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.631489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kqwv\" (UniqueName: \"kubernetes.io/projected/07d623f5-216e-402f-aee1-e66c892b74f6-kube-api-access-2kqwv\") pod \"nova-cell0-db-create-r6m6d\" (UID: \"07d623f5-216e-402f-aee1-e66c892b74f6\") " pod="openstack/nova-cell0-db-create-r6m6d" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.631892 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3cf74d09-587e-410e-b450-e4d5206d4f55" (UID: "3cf74d09-587e-410e-b450-e4d5206d4f55"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.632395 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3cf74d09-587e-410e-b450-e4d5206d4f55" (UID: "3cf74d09-587e-410e-b450-e4d5206d4f55"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.635176 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.641448 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-scripts" (OuterVolumeSpecName: "scripts") pod "3cf74d09-587e-410e-b450-e4d5206d4f55" (UID: "3cf74d09-587e-410e-b450-e4d5206d4f55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.641350 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf74d09-587e-410e-b450-e4d5206d4f55-kube-api-access-vv2dm" (OuterVolumeSpecName: "kube-api-access-vv2dm") pod "3cf74d09-587e-410e-b450-e4d5206d4f55" (UID: "3cf74d09-587e-410e-b450-e4d5206d4f55"). InnerVolumeSpecName "kube-api-access-vv2dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.645603 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3cf74d09-587e-410e-b450-e4d5206d4f55" (UID: "3cf74d09-587e-410e-b450-e4d5206d4f55"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.648887 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-gp72n"] Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.649327 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="proxy-httpd" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.649345 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="proxy-httpd" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.649360 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-central-agent" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.649369 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-central-agent" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.649376 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-notification-agent" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.649384 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-notification-agent" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.649575 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-central-agent" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.649605 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="proxy-httpd" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.649617 4907 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerName="ceilometer-notification-agent" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.650207 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gp72n" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.663429 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gp72n"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.695126 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.736494 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kqwv\" (UniqueName: \"kubernetes.io/projected/07d623f5-216e-402f-aee1-e66c892b74f6-kube-api-access-2kqwv\") pod \"nova-cell0-db-create-r6m6d\" (UID: \"07d623f5-216e-402f-aee1-e66c892b74f6\") " pod="openstack/nova-cell0-db-create-r6m6d" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.736699 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwfmg\" (UniqueName: \"kubernetes.io/projected/5c4dcdac-5087-4f48-ba67-c508cf316b2b-kube-api-access-gwfmg\") pod \"nova-cell1-db-create-gp72n\" (UID: \"5c4dcdac-5087-4f48-ba67-c508cf316b2b\") " pod="openstack/nova-cell1-db-create-gp72n" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.736886 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6xf5\" (UniqueName: \"kubernetes.io/projected/0bd23d06-3e11-4a14-bddf-dd7e4782fd45-kube-api-access-m6xf5\") pod \"nova-api-db-create-6tdng\" (UID: \"0bd23d06-3e11-4a14-bddf-dd7e4782fd45\") " pod="openstack/nova-api-db-create-6tdng" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.737009 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/3cf74d09-587e-410e-b450-e4d5206d4f55-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.737086 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.737166 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv2dm\" (UniqueName: \"kubernetes.io/projected/3cf74d09-587e-410e-b450-e4d5206d4f55-kube-api-access-vv2dm\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.737240 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.752015 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6xf5\" (UniqueName: \"kubernetes.io/projected/0bd23d06-3e11-4a14-bddf-dd7e4782fd45-kube-api-access-m6xf5\") pod \"nova-api-db-create-6tdng\" (UID: \"0bd23d06-3e11-4a14-bddf-dd7e4782fd45\") " pod="openstack/nova-api-db-create-6tdng" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.755737 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kqwv\" (UniqueName: \"kubernetes.io/projected/07d623f5-216e-402f-aee1-e66c892b74f6-kube-api-access-2kqwv\") pod \"nova-cell0-db-create-r6m6d\" (UID: \"07d623f5-216e-402f-aee1-e66c892b74f6\") " pod="openstack/nova-cell0-db-create-r6m6d" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.772845 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"3cf74d09-587e-410e-b450-e4d5206d4f55" (UID: "3cf74d09-587e-410e-b450-e4d5206d4f55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.790836 4907 generic.go:334] "Generic (PLEG): container finished" podID="3cf74d09-587e-410e-b450-e4d5206d4f55" containerID="6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad" exitCode=137 Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.791066 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.791528 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerDied","Data":"6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad"} Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.791596 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3cf74d09-587e-410e-b450-e4d5206d4f55","Type":"ContainerDied","Data":"3aa8053d4c1c794976840941d3f8dd1df01300f4f30d921aa3b3aeb4ba0ab84e"} Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.791618 4907 scope.go:117] "RemoveContainer" containerID="6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.813445 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78d77d8c99-ljb4q" event={"ID":"6b2c2269-5cb3-4bf8-a162-e6a11531eca4","Type":"ContainerStarted","Data":"803ef6b9e8b4f1df1e86f48886db8bccab5f70286de60725ba687bd60e19d0a9"} Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.813505 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78d77d8c99-ljb4q" 
event={"ID":"6b2c2269-5cb3-4bf8-a162-e6a11531eca4","Type":"ContainerStarted","Data":"c44465b7c1ba62884c7f88059a3fdccedaab303d608c005e983f8cd1e89fdabd"} Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.813516 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78d77d8c99-ljb4q" event={"ID":"6b2c2269-5cb3-4bf8-a162-e6a11531eca4","Type":"ContainerStarted","Data":"82dd482a6bf39df4daba02362da736a1bb7de0a0d0bb554cc16a7f309e291908"} Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.813559 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.813581 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.821581 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7496575b44-md76p" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.821668 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7496575b44-md76p" event={"ID":"16b9987f-f46c-4f23-851d-152c49a34fea","Type":"ContainerDied","Data":"27be74c801b4f51b3ba3039da869ab69df8ebc251d7fc6c5644e4eebbfe95e2f"} Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.824769 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-config-data" (OuterVolumeSpecName: "config-data") pod "3cf74d09-587e-410e-b450-e4d5206d4f55" (UID: "3cf74d09-587e-410e-b450-e4d5206d4f55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.839632 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwfmg\" (UniqueName: \"kubernetes.io/projected/5c4dcdac-5087-4f48-ba67-c508cf316b2b-kube-api-access-gwfmg\") pod \"nova-cell1-db-create-gp72n\" (UID: \"5c4dcdac-5087-4f48-ba67-c508cf316b2b\") " pod="openstack/nova-cell1-db-create-gp72n" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.839755 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.839773 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf74d09-587e-410e-b450-e4d5206d4f55-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.865710 4907 scope.go:117] "RemoveContainer" containerID="e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.872981 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwfmg\" (UniqueName: \"kubernetes.io/projected/5c4dcdac-5087-4f48-ba67-c508cf316b2b-kube-api-access-gwfmg\") pod \"nova-cell1-db-create-gp72n\" (UID: \"5c4dcdac-5087-4f48-ba67-c508cf316b2b\") " pod="openstack/nova-cell1-db-create-gp72n" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.873069 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-78d77d8c99-ljb4q" podStartSLOduration=7.873045625 podStartE2EDuration="7.873045625s" podCreationTimestamp="2025-10-09 19:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 
19:47:12.840370953 +0000 UTC m=+1118.372338442" watchObservedRunningTime="2025-10-09 19:47:12.873045625 +0000 UTC m=+1118.405013134" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.873357 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6tdng" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.883644 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7496575b44-md76p"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.884815 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r6m6d" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.902940 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7496575b44-md76p"] Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.922659 4907 scope.go:117] "RemoveContainer" containerID="e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.983901 4907 scope.go:117] "RemoveContainer" containerID="6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.987207 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad\": container with ID starting with 6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad not found: ID does not exist" containerID="6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.987243 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad"} err="failed to get container status \"6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad\": rpc error: code = 
NotFound desc = could not find container \"6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad\": container with ID starting with 6d260e16ff4530ff24ee50575bee5731ef86379ac53895caf33a93c15ae98dad not found: ID does not exist" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.987267 4907 scope.go:117] "RemoveContainer" containerID="e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.987600 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e\": container with ID starting with e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e not found: ID does not exist" containerID="e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.987636 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e"} err="failed to get container status \"e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e\": rpc error: code = NotFound desc = could not find container \"e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e\": container with ID starting with e50dfb4ccdb47b1a4f9b8c37ce0ed4661ccdacc915e86cdf2ccc361224cff98e not found: ID does not exist" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.987662 4907 scope.go:117] "RemoveContainer" containerID="e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020" Oct 09 19:47:12 crc kubenswrapper[4907]: E1009 19:47:12.987880 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020\": container with ID starting with 
e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020 not found: ID does not exist" containerID="e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.987903 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020"} err="failed to get container status \"e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020\": rpc error: code = NotFound desc = could not find container \"e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020\": container with ID starting with e032d4b59265d41a8eb25d7684089f874fd1de0873ecc89d5b2a767f26d44020 not found: ID does not exist" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.987916 4907 scope.go:117] "RemoveContainer" containerID="ee86b2e12031a669857531dfec780c7c87280b227afd9d3cd8fecabbce2e4e68" Oct 09 19:47:12 crc kubenswrapper[4907]: I1009 19:47:12.993892 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gp72n" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.040563 4907 scope.go:117] "RemoveContainer" containerID="80a3619fa77a23367b745d811eeb278943eee9bb68e249c257b3d1ef99d38ed4" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.171894 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b9987f-f46c-4f23-851d-152c49a34fea" path="/var/lib/kubelet/pods/16b9987f-f46c-4f23-851d-152c49a34fea/volumes" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.173021 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3160fd3-8937-418d-846b-47aff379cded" path="/var/lib/kubelet/pods/a3160fd3-8937-418d-846b-47aff379cded/volumes" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.174518 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb245d3-95d6-4701-b66d-549ae443b0be" path="/var/lib/kubelet/pods/adb245d3-95d6-4701-b66d-549ae443b0be/volumes" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.248529 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.333527 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.352275 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.354366 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.358048 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.378509 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.379073 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.407841 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:13 crc kubenswrapper[4907]: E1009 19:47:13.457657 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf74d09_587e_410e_b450_e4d5206d4f55.slice\": RecentStats: unable to find data in memory cache]" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.464559 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-scripts\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.464649 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.465533 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-log-httpd\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.465661 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.465702 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg7wj\" (UniqueName: \"kubernetes.io/projected/81971130-976c-409f-b724-97b2840b5027-kube-api-access-lg7wj\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.465751 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-run-httpd\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.465843 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-config-data\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.553039 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6tdng"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.567412 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.567452 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg7wj\" (UniqueName: \"kubernetes.io/projected/81971130-976c-409f-b724-97b2840b5027-kube-api-access-lg7wj\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.567509 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-run-httpd\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.567529 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-config-data\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.567563 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-scripts\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.567592 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 
crc kubenswrapper[4907]: I1009 19:47:13.567618 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-log-httpd\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.570020 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-log-httpd\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.571104 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-run-httpd\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.571829 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.573634 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-scripts\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.586489 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-config-data\") pod \"ceilometer-0\" (UID: 
\"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.587788 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.595133 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg7wj\" (UniqueName: \"kubernetes.io/projected/81971130-976c-409f-b724-97b2840b5027-kube-api-access-lg7wj\") pod \"ceilometer-0\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.597459 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-r6m6d"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.634036 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.634442 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.850056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dc694291-2c4e-4bdf-b00c-4025d2018e96","Type":"ContainerStarted","Data":"94907b084c37059efc6abbec48164b642243b3fb026aff1215bca7e9eb5093de"} Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.887060 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6tdng" event={"ID":"0bd23d06-3e11-4a14-bddf-dd7e4782fd45","Type":"ContainerStarted","Data":"f92e078e8c6f559a6b798d9fa86c531d810bd18b3f7b0118d94d3885da33278a"} Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.892293 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r6m6d" event={"ID":"07d623f5-216e-402f-aee1-e66c892b74f6","Type":"ContainerStarted","Data":"27c9eace25d2fe425a2b3859f7b178d90041eba11a72980ab0439b12f7d08bcd"} Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.895550 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gp72n"] Oct 09 19:47:13 crc kubenswrapper[4907]: I1009 19:47:13.907296 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-6tdng" podStartSLOduration=1.907278542 podStartE2EDuration="1.907278542s" podCreationTimestamp="2025-10-09 19:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:47:13.901826048 +0000 UTC m=+1119.433793537" watchObservedRunningTime="2025-10-09 19:47:13.907278542 +0000 UTC m=+1119.439246031" Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.194509 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.831641 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/cinder-api-0" Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.906246 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerStarted","Data":"5edae25cea52aec80dcb75480d2aa73c9cdcfa480d04eb3687d5c3d9dc487095"} Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.910445 4907 generic.go:334] "Generic (PLEG): container finished" podID="0bd23d06-3e11-4a14-bddf-dd7e4782fd45" containerID="85df2f84592e794839e3cf2e1bbad860966453e22698796a0a2d9acfbd5f24d6" exitCode=0 Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.910502 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6tdng" event={"ID":"0bd23d06-3e11-4a14-bddf-dd7e4782fd45","Type":"ContainerDied","Data":"85df2f84592e794839e3cf2e1bbad860966453e22698796a0a2d9acfbd5f24d6"} Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.912106 4907 generic.go:334] "Generic (PLEG): container finished" podID="07d623f5-216e-402f-aee1-e66c892b74f6" containerID="f0d2ffeb2382b39980ab23b8ace7986ea8a2e8c20beba1694867ef48bde67991" exitCode=0 Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.912148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r6m6d" event={"ID":"07d623f5-216e-402f-aee1-e66c892b74f6","Type":"ContainerDied","Data":"f0d2ffeb2382b39980ab23b8ace7986ea8a2e8c20beba1694867ef48bde67991"} Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.913201 4907 generic.go:334] "Generic (PLEG): container finished" podID="5c4dcdac-5087-4f48-ba67-c508cf316b2b" containerID="2c547b2367d46b709f3d18badcd8c0642243990baca3f07a08d1a87550e4bb08" exitCode=0 Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.913239 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gp72n" 
event={"ID":"5c4dcdac-5087-4f48-ba67-c508cf316b2b","Type":"ContainerDied","Data":"2c547b2367d46b709f3d18badcd8c0642243990baca3f07a08d1a87550e4bb08"} Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.913253 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gp72n" event={"ID":"5c4dcdac-5087-4f48-ba67-c508cf316b2b","Type":"ContainerStarted","Data":"1300d31cfd407202c85266e73c3020d175abca19e981a38dea29646756a53c33"} Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.915209 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dc694291-2c4e-4bdf-b00c-4025d2018e96","Type":"ContainerStarted","Data":"40efd8d7417d09bfd7d841ac07c91f77bd981be6b5fad2a56250e3df8e7c8a6d"} Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.915231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dc694291-2c4e-4bdf-b00c-4025d2018e96","Type":"ContainerStarted","Data":"dc1b0a3a7b7b9118892394ff4f3be91af00953b2b6dfa5f2d81058599ec90c9c"} Oct 09 19:47:14 crc kubenswrapper[4907]: I1009 19:47:14.963108 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.963089488 podStartE2EDuration="2.963089488s" podCreationTimestamp="2025-10-09 19:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:47:14.958055415 +0000 UTC m=+1120.490022914" watchObservedRunningTime="2025-10-09 19:47:14.963089488 +0000 UTC m=+1120.495056987" Oct 09 19:47:15 crc kubenswrapper[4907]: I1009 19:47:15.181142 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf74d09-587e-410e-b450-e4d5206d4f55" path="/var/lib/kubelet/pods/3cf74d09-587e-410e-b450-e4d5206d4f55/volumes" Oct 09 19:47:15 crc kubenswrapper[4907]: I1009 19:47:15.936829 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerStarted","Data":"b45a02c7c23d413379a28f4c4b0704e056c843b792023f7f8ff7c76e0d18e0b1"} Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.384070 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6tdng" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.435455 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6xf5\" (UniqueName: \"kubernetes.io/projected/0bd23d06-3e11-4a14-bddf-dd7e4782fd45-kube-api-access-m6xf5\") pod \"0bd23d06-3e11-4a14-bddf-dd7e4782fd45\" (UID: \"0bd23d06-3e11-4a14-bddf-dd7e4782fd45\") " Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.446745 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd23d06-3e11-4a14-bddf-dd7e4782fd45-kube-api-access-m6xf5" (OuterVolumeSpecName: "kube-api-access-m6xf5") pod "0bd23d06-3e11-4a14-bddf-dd7e4782fd45" (UID: "0bd23d06-3e11-4a14-bddf-dd7e4782fd45"). InnerVolumeSpecName "kube-api-access-m6xf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.525116 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r6m6d" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.530731 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gp72n" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.538132 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6xf5\" (UniqueName: \"kubernetes.io/projected/0bd23d06-3e11-4a14-bddf-dd7e4782fd45-kube-api-access-m6xf5\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.639112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwfmg\" (UniqueName: \"kubernetes.io/projected/5c4dcdac-5087-4f48-ba67-c508cf316b2b-kube-api-access-gwfmg\") pod \"5c4dcdac-5087-4f48-ba67-c508cf316b2b\" (UID: \"5c4dcdac-5087-4f48-ba67-c508cf316b2b\") " Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.639335 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kqwv\" (UniqueName: \"kubernetes.io/projected/07d623f5-216e-402f-aee1-e66c892b74f6-kube-api-access-2kqwv\") pod \"07d623f5-216e-402f-aee1-e66c892b74f6\" (UID: \"07d623f5-216e-402f-aee1-e66c892b74f6\") " Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.643561 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d623f5-216e-402f-aee1-e66c892b74f6-kube-api-access-2kqwv" (OuterVolumeSpecName: "kube-api-access-2kqwv") pod "07d623f5-216e-402f-aee1-e66c892b74f6" (UID: "07d623f5-216e-402f-aee1-e66c892b74f6"). InnerVolumeSpecName "kube-api-access-2kqwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.644593 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4dcdac-5087-4f48-ba67-c508cf316b2b-kube-api-access-gwfmg" (OuterVolumeSpecName: "kube-api-access-gwfmg") pod "5c4dcdac-5087-4f48-ba67-c508cf316b2b" (UID: "5c4dcdac-5087-4f48-ba67-c508cf316b2b"). InnerVolumeSpecName "kube-api-access-gwfmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.741644 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kqwv\" (UniqueName: \"kubernetes.io/projected/07d623f5-216e-402f-aee1-e66c892b74f6-kube-api-access-2kqwv\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.741679 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwfmg\" (UniqueName: \"kubernetes.io/projected/5c4dcdac-5087-4f48-ba67-c508cf316b2b-kube-api-access-gwfmg\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.950915 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerStarted","Data":"52d0931da202d69ceeae756fb3b45eaee9a85e9492e44351c3ff4fc58cc9a5c4"} Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.951018 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerStarted","Data":"6ee50c24d9a124bae7a741b89c498d40024cf47cc2b51bdd99fc367056d6e6d2"} Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.954540 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6tdng" event={"ID":"0bd23d06-3e11-4a14-bddf-dd7e4782fd45","Type":"ContainerDied","Data":"f92e078e8c6f559a6b798d9fa86c531d810bd18b3f7b0118d94d3885da33278a"} Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.954572 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6tdng" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.954585 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92e078e8c6f559a6b798d9fa86c531d810bd18b3f7b0118d94d3885da33278a" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.956705 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-r6m6d" event={"ID":"07d623f5-216e-402f-aee1-e66c892b74f6","Type":"ContainerDied","Data":"27c9eace25d2fe425a2b3859f7b178d90041eba11a72980ab0439b12f7d08bcd"} Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.956735 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c9eace25d2fe425a2b3859f7b178d90041eba11a72980ab0439b12f7d08bcd" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.956829 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-r6m6d" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.958532 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gp72n" event={"ID":"5c4dcdac-5087-4f48-ba67-c508cf316b2b","Type":"ContainerDied","Data":"1300d31cfd407202c85266e73c3020d175abca19e981a38dea29646756a53c33"} Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.958656 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1300d31cfd407202c85266e73c3020d175abca19e981a38dea29646756a53c33" Oct 09 19:47:16 crc kubenswrapper[4907]: I1009 19:47:16.958588 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gp72n" Oct 09 19:47:17 crc kubenswrapper[4907]: I1009 19:47:17.696025 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 19:47:18 crc kubenswrapper[4907]: I1009 19:47:18.980189 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerStarted","Data":"2792179a9c65ee25e01306e3a9d97502e63508b0743333011320dd77aab19e9a"} Oct 09 19:47:18 crc kubenswrapper[4907]: I1009 19:47:18.980570 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="ceilometer-central-agent" containerID="cri-o://b45a02c7c23d413379a28f4c4b0704e056c843b792023f7f8ff7c76e0d18e0b1" gracePeriod=30 Oct 09 19:47:18 crc kubenswrapper[4907]: I1009 19:47:18.980603 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 19:47:18 crc kubenswrapper[4907]: I1009 19:47:18.980712 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="sg-core" containerID="cri-o://52d0931da202d69ceeae756fb3b45eaee9a85e9492e44351c3ff4fc58cc9a5c4" gracePeriod=30 Oct 09 19:47:18 crc kubenswrapper[4907]: I1009 19:47:18.980713 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="proxy-httpd" containerID="cri-o://2792179a9c65ee25e01306e3a9d97502e63508b0743333011320dd77aab19e9a" gracePeriod=30 Oct 09 19:47:18 crc kubenswrapper[4907]: I1009 19:47:18.980774 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="ceilometer-notification-agent" 
containerID="cri-o://6ee50c24d9a124bae7a741b89c498d40024cf47cc2b51bdd99fc367056d6e6d2" gracePeriod=30 Oct 09 19:47:19 crc kubenswrapper[4907]: I1009 19:47:19.008751 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.784724281 podStartE2EDuration="6.008733955s" podCreationTimestamp="2025-10-09 19:47:13 +0000 UTC" firstStartedPulling="2025-10-09 19:47:14.210919342 +0000 UTC m=+1119.742886831" lastFinishedPulling="2025-10-09 19:47:18.434929006 +0000 UTC m=+1123.966896505" observedRunningTime="2025-10-09 19:47:19.004392578 +0000 UTC m=+1124.536360077" watchObservedRunningTime="2025-10-09 19:47:19.008733955 +0000 UTC m=+1124.540701444" Oct 09 19:47:19 crc kubenswrapper[4907]: I1009 19:47:19.991028 4907 generic.go:334] "Generic (PLEG): container finished" podID="81971130-976c-409f-b724-97b2840b5027" containerID="2792179a9c65ee25e01306e3a9d97502e63508b0743333011320dd77aab19e9a" exitCode=0 Oct 09 19:47:19 crc kubenswrapper[4907]: I1009 19:47:19.991358 4907 generic.go:334] "Generic (PLEG): container finished" podID="81971130-976c-409f-b724-97b2840b5027" containerID="52d0931da202d69ceeae756fb3b45eaee9a85e9492e44351c3ff4fc58cc9a5c4" exitCode=2 Oct 09 19:47:19 crc kubenswrapper[4907]: I1009 19:47:19.991372 4907 generic.go:334] "Generic (PLEG): container finished" podID="81971130-976c-409f-b724-97b2840b5027" containerID="6ee50c24d9a124bae7a741b89c498d40024cf47cc2b51bdd99fc367056d6e6d2" exitCode=0 Oct 09 19:47:19 crc kubenswrapper[4907]: I1009 19:47:19.991080 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerDied","Data":"2792179a9c65ee25e01306e3a9d97502e63508b0743333011320dd77aab19e9a"} Oct 09 19:47:19 crc kubenswrapper[4907]: I1009 19:47:19.991413 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerDied","Data":"52d0931da202d69ceeae756fb3b45eaee9a85e9492e44351c3ff4fc58cc9a5c4"} Oct 09 19:47:19 crc kubenswrapper[4907]: I1009 19:47:19.991431 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerDied","Data":"6ee50c24d9a124bae7a741b89c498d40024cf47cc2b51bdd99fc367056d6e6d2"} Oct 09 19:47:20 crc kubenswrapper[4907]: I1009 19:47:20.898282 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:20 crc kubenswrapper[4907]: I1009 19:47:20.901382 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78d77d8c99-ljb4q" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.003285 4907 generic.go:334] "Generic (PLEG): container finished" podID="81971130-976c-409f-b724-97b2840b5027" containerID="b45a02c7c23d413379a28f4c4b0704e056c843b792023f7f8ff7c76e0d18e0b1" exitCode=0 Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.003343 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerDied","Data":"b45a02c7c23d413379a28f4c4b0704e056c843b792023f7f8ff7c76e0d18e0b1"} Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.608218 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.731382 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-config-data\") pod \"81971130-976c-409f-b724-97b2840b5027\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.731496 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-log-httpd\") pod \"81971130-976c-409f-b724-97b2840b5027\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.731554 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-scripts\") pod \"81971130-976c-409f-b724-97b2840b5027\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.731578 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-combined-ca-bundle\") pod \"81971130-976c-409f-b724-97b2840b5027\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.731667 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-run-httpd\") pod \"81971130-976c-409f-b724-97b2840b5027\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.731720 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-sg-core-conf-yaml\") pod \"81971130-976c-409f-b724-97b2840b5027\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.731746 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg7wj\" (UniqueName: \"kubernetes.io/projected/81971130-976c-409f-b724-97b2840b5027-kube-api-access-lg7wj\") pod \"81971130-976c-409f-b724-97b2840b5027\" (UID: \"81971130-976c-409f-b724-97b2840b5027\") " Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.733134 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "81971130-976c-409f-b724-97b2840b5027" (UID: "81971130-976c-409f-b724-97b2840b5027"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.733663 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "81971130-976c-409f-b724-97b2840b5027" (UID: "81971130-976c-409f-b724-97b2840b5027"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.739574 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81971130-976c-409f-b724-97b2840b5027-kube-api-access-lg7wj" (OuterVolumeSpecName: "kube-api-access-lg7wj") pod "81971130-976c-409f-b724-97b2840b5027" (UID: "81971130-976c-409f-b724-97b2840b5027"). InnerVolumeSpecName "kube-api-access-lg7wj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.750657 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-scripts" (OuterVolumeSpecName: "scripts") pod "81971130-976c-409f-b724-97b2840b5027" (UID: "81971130-976c-409f-b724-97b2840b5027"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.776688 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "81971130-976c-409f-b724-97b2840b5027" (UID: "81971130-976c-409f-b724-97b2840b5027"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.821025 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81971130-976c-409f-b724-97b2840b5027" (UID: "81971130-976c-409f-b724-97b2840b5027"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.834822 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.834850 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.834860 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.834870 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81971130-976c-409f-b724-97b2840b5027-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.834878 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.834887 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg7wj\" (UniqueName: \"kubernetes.io/projected/81971130-976c-409f-b724-97b2840b5027-kube-api-access-lg7wj\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.849448 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-config-data" (OuterVolumeSpecName: "config-data") pod "81971130-976c-409f-b724-97b2840b5027" (UID: "81971130-976c-409f-b724-97b2840b5027"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:21 crc kubenswrapper[4907]: I1009 19:47:21.938171 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81971130-976c-409f-b724-97b2840b5027-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.014496 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81971130-976c-409f-b724-97b2840b5027","Type":"ContainerDied","Data":"5edae25cea52aec80dcb75480d2aa73c9cdcfa480d04eb3687d5c3d9dc487095"} Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.014555 4907 scope.go:117] "RemoveContainer" containerID="2792179a9c65ee25e01306e3a9d97502e63508b0743333011320dd77aab19e9a" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.014559 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.036781 4907 scope.go:117] "RemoveContainer" containerID="52d0931da202d69ceeae756fb3b45eaee9a85e9492e44351c3ff4fc58cc9a5c4" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.046620 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.053574 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.073260 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:22 crc kubenswrapper[4907]: E1009 19:47:22.076724 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="ceilometer-central-agent" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.076759 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="81971130-976c-409f-b724-97b2840b5027" 
containerName="ceilometer-central-agent" Oct 09 19:47:22 crc kubenswrapper[4907]: E1009 19:47:22.076798 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="sg-core" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.076806 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="sg-core" Oct 09 19:47:22 crc kubenswrapper[4907]: E1009 19:47:22.076818 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4dcdac-5087-4f48-ba67-c508cf316b2b" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.076825 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4dcdac-5087-4f48-ba67-c508cf316b2b" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: E1009 19:47:22.076844 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d623f5-216e-402f-aee1-e66c892b74f6" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.076851 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d623f5-216e-402f-aee1-e66c892b74f6" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: E1009 19:47:22.076861 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="proxy-httpd" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.076867 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="proxy-httpd" Oct 09 19:47:22 crc kubenswrapper[4907]: E1009 19:47:22.076887 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd23d06-3e11-4a14-bddf-dd7e4782fd45" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.076894 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0bd23d06-3e11-4a14-bddf-dd7e4782fd45" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: E1009 19:47:22.076909 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="ceilometer-notification-agent" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.076916 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="ceilometer-notification-agent" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.077180 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="ceilometer-notification-agent" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.077199 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4dcdac-5087-4f48-ba67-c508cf316b2b" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.077221 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="ceilometer-central-agent" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.077236 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="proxy-httpd" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.077251 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd23d06-3e11-4a14-bddf-dd7e4782fd45" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.077262 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d623f5-216e-402f-aee1-e66c892b74f6" containerName="mariadb-database-create" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.077273 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="81971130-976c-409f-b724-97b2840b5027" containerName="sg-core" Oct 09 19:47:22 crc 
kubenswrapper[4907]: I1009 19:47:22.079292 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.083528 4907 scope.go:117] "RemoveContainer" containerID="6ee50c24d9a124bae7a741b89c498d40024cf47cc2b51bdd99fc367056d6e6d2" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.083625 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.087704 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.106014 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.123939 4907 scope.go:117] "RemoveContainer" containerID="b45a02c7c23d413379a28f4c4b0704e056c843b792023f7f8ff7c76e0d18e0b1" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.150528 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-run-httpd\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.150593 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-log-httpd\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.150632 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-scripts\") pod \"ceilometer-0\" 
(UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.150761 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jc72\" (UniqueName: \"kubernetes.io/projected/1b4d1c6e-fe49-4334-8423-ab76a02c6620-kube-api-access-9jc72\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.150800 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.150856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.150898 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-config-data\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.252332 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-run-httpd\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 
19:47:22.252756 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-log-httpd\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.252955 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-scripts\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.253217 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jc72\" (UniqueName: \"kubernetes.io/projected/1b4d1c6e-fe49-4334-8423-ab76a02c6620-kube-api-access-9jc72\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.253250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.253274 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.253296 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-log-httpd\") pod \"ceilometer-0\" (UID: 
\"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.253302 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-config-data\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.252969 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-run-httpd\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.258090 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.258246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.258811 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-scripts\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.259027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-config-data\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.271333 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jc72\" (UniqueName: \"kubernetes.io/projected/1b4d1c6e-fe49-4334-8423-ab76a02c6620-kube-api-access-9jc72\") pod \"ceilometer-0\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.410697 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.659380 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-550c-account-create-m5bhx"] Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.661084 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-550c-account-create-m5bhx" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.668895 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-550c-account-create-m5bhx"] Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.669213 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.761975 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vgd5\" (UniqueName: \"kubernetes.io/projected/d6f15805-573f-4e5e-9897-9ba5f8f72d28-kube-api-access-5vgd5\") pod \"nova-api-550c-account-create-m5bhx\" (UID: \"d6f15805-573f-4e5e-9897-9ba5f8f72d28\") " pod="openstack/nova-api-550c-account-create-m5bhx" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.839947 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e625-account-create-44ktn"] 
Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.841041 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e625-account-create-44ktn" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.843554 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.855989 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e625-account-create-44ktn"] Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.863410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vgd5\" (UniqueName: \"kubernetes.io/projected/d6f15805-573f-4e5e-9897-9ba5f8f72d28-kube-api-access-5vgd5\") pod \"nova-api-550c-account-create-m5bhx\" (UID: \"d6f15805-573f-4e5e-9897-9ba5f8f72d28\") " pod="openstack/nova-api-550c-account-create-m5bhx" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.863558 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flssm\" (UniqueName: \"kubernetes.io/projected/26711161-fa32-4166-b646-1958af798b80-kube-api-access-flssm\") pod \"nova-cell0-e625-account-create-44ktn\" (UID: \"26711161-fa32-4166-b646-1958af798b80\") " pod="openstack/nova-cell0-e625-account-create-44ktn" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.892596 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vgd5\" (UniqueName: \"kubernetes.io/projected/d6f15805-573f-4e5e-9897-9ba5f8f72d28-kube-api-access-5vgd5\") pod \"nova-api-550c-account-create-m5bhx\" (UID: \"d6f15805-573f-4e5e-9897-9ba5f8f72d28\") " pod="openstack/nova-api-550c-account-create-m5bhx" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.928054 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.957040 
4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.965659 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flssm\" (UniqueName: \"kubernetes.io/projected/26711161-fa32-4166-b646-1958af798b80-kube-api-access-flssm\") pod \"nova-cell0-e625-account-create-44ktn\" (UID: \"26711161-fa32-4166-b646-1958af798b80\") " pod="openstack/nova-cell0-e625-account-create-44ktn" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.990292 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-550c-account-create-m5bhx" Oct 09 19:47:22 crc kubenswrapper[4907]: I1009 19:47:22.992406 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flssm\" (UniqueName: \"kubernetes.io/projected/26711161-fa32-4166-b646-1958af798b80-kube-api-access-flssm\") pod \"nova-cell0-e625-account-create-44ktn\" (UID: \"26711161-fa32-4166-b646-1958af798b80\") " pod="openstack/nova-cell0-e625-account-create-44ktn" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.047228 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerStarted","Data":"1d7a3e3a7dd10eb88ab99a277494c3ac6549a49d496cac7ecbf7f4a6a3d973bc"} Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.050964 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fc5a-account-create-fhcs9"] Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.052087 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fc5a-account-create-fhcs9" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.054590 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.063780 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fc5a-account-create-fhcs9"] Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.169348 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtt5b\" (UniqueName: \"kubernetes.io/projected/d250ac44-116b-4ab0-9ba9-1396ea38a602-kube-api-access-dtt5b\") pod \"nova-cell1-fc5a-account-create-fhcs9\" (UID: \"d250ac44-116b-4ab0-9ba9-1396ea38a602\") " pod="openstack/nova-cell1-fc5a-account-create-fhcs9" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.171143 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81971130-976c-409f-b724-97b2840b5027" path="/var/lib/kubelet/pods/81971130-976c-409f-b724-97b2840b5027/volumes" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.180539 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e625-account-create-44ktn" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.271400 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtt5b\" (UniqueName: \"kubernetes.io/projected/d250ac44-116b-4ab0-9ba9-1396ea38a602-kube-api-access-dtt5b\") pod \"nova-cell1-fc5a-account-create-fhcs9\" (UID: \"d250ac44-116b-4ab0-9ba9-1396ea38a602\") " pod="openstack/nova-cell1-fc5a-account-create-fhcs9" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.293142 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtt5b\" (UniqueName: \"kubernetes.io/projected/d250ac44-116b-4ab0-9ba9-1396ea38a602-kube-api-access-dtt5b\") pod \"nova-cell1-fc5a-account-create-fhcs9\" (UID: \"d250ac44-116b-4ab0-9ba9-1396ea38a602\") " pod="openstack/nova-cell1-fc5a-account-create-fhcs9" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.415763 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fc5a-account-create-fhcs9" Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.475249 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-550c-account-create-m5bhx"] Oct 09 19:47:23 crc kubenswrapper[4907]: W1009 19:47:23.486419 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6f15805_573f_4e5e_9897_9ba5f8f72d28.slice/crio-9d5b5ef2afaddb1e0b417c35259f8f1fe18f16e140c31d15cdaf18038a154926 WatchSource:0}: Error finding container 9d5b5ef2afaddb1e0b417c35259f8f1fe18f16e140c31d15cdaf18038a154926: Status 404 returned error can't find the container with id 9d5b5ef2afaddb1e0b417c35259f8f1fe18f16e140c31d15cdaf18038a154926 Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.728679 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e625-account-create-44ktn"] Oct 09 19:47:23 crc kubenswrapper[4907]: W1009 19:47:23.730005 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26711161_fa32_4166_b646_1958af798b80.slice/crio-1c3e9bc317800e2ca9532f19300307fd1aa2d23a3f789751d53ff89dbbce1de0 WatchSource:0}: Error finding container 1c3e9bc317800e2ca9532f19300307fd1aa2d23a3f789751d53ff89dbbce1de0: Status 404 returned error can't find the container with id 1c3e9bc317800e2ca9532f19300307fd1aa2d23a3f789751d53ff89dbbce1de0 Oct 09 19:47:23 crc kubenswrapper[4907]: I1009 19:47:23.964107 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fc5a-account-create-fhcs9"] Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.070895 4907 generic.go:334] "Generic (PLEG): container finished" podID="d6f15805-573f-4e5e-9897-9ba5f8f72d28" containerID="b2a247449d1e5f790005794a741e6830839884d221a4363c738eaa97b9dfc968" exitCode=0 Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.070957 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-550c-account-create-m5bhx" event={"ID":"d6f15805-573f-4e5e-9897-9ba5f8f72d28","Type":"ContainerDied","Data":"b2a247449d1e5f790005794a741e6830839884d221a4363c738eaa97b9dfc968"} Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.073317 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-550c-account-create-m5bhx" event={"ID":"d6f15805-573f-4e5e-9897-9ba5f8f72d28","Type":"ContainerStarted","Data":"9d5b5ef2afaddb1e0b417c35259f8f1fe18f16e140c31d15cdaf18038a154926"} Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.079885 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fc5a-account-create-fhcs9" event={"ID":"d250ac44-116b-4ab0-9ba9-1396ea38a602","Type":"ContainerStarted","Data":"f5ba628e08159fd39e6948b2977c18af7b5d858da2bb7ac6cf8fe680f57439a8"} Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.086391 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerStarted","Data":"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69"} Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.093833 4907 generic.go:334] "Generic (PLEG): container finished" podID="26711161-fa32-4166-b646-1958af798b80" containerID="e5a38ec84468a29c5eb5e65b35dcc6e955e84c40ab4adea53ce7879912bedc70" exitCode=0 Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.093900 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e625-account-create-44ktn" event={"ID":"26711161-fa32-4166-b646-1958af798b80","Type":"ContainerDied","Data":"e5a38ec84468a29c5eb5e65b35dcc6e955e84c40ab4adea53ce7879912bedc70"} Oct 09 19:47:24 crc kubenswrapper[4907]: I1009 19:47:24.093945 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e625-account-create-44ktn" 
event={"ID":"26711161-fa32-4166-b646-1958af798b80","Type":"ContainerStarted","Data":"1c3e9bc317800e2ca9532f19300307fd1aa2d23a3f789751d53ff89dbbce1de0"} Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.110598 4907 generic.go:334] "Generic (PLEG): container finished" podID="d250ac44-116b-4ab0-9ba9-1396ea38a602" containerID="fcca4b1f90b430b4824ddb7322a68927b1229aa6edfb339ba2503f5eb026bc46" exitCode=0 Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.110638 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fc5a-account-create-fhcs9" event={"ID":"d250ac44-116b-4ab0-9ba9-1396ea38a602","Type":"ContainerDied","Data":"fcca4b1f90b430b4824ddb7322a68927b1229aa6edfb339ba2503f5eb026bc46"} Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.114704 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerStarted","Data":"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da"} Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.597213 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e625-account-create-44ktn" Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.603017 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-550c-account-create-m5bhx" Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.643598 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flssm\" (UniqueName: \"kubernetes.io/projected/26711161-fa32-4166-b646-1958af798b80-kube-api-access-flssm\") pod \"26711161-fa32-4166-b646-1958af798b80\" (UID: \"26711161-fa32-4166-b646-1958af798b80\") " Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.643664 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vgd5\" (UniqueName: \"kubernetes.io/projected/d6f15805-573f-4e5e-9897-9ba5f8f72d28-kube-api-access-5vgd5\") pod \"d6f15805-573f-4e5e-9897-9ba5f8f72d28\" (UID: \"d6f15805-573f-4e5e-9897-9ba5f8f72d28\") " Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.651026 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26711161-fa32-4166-b646-1958af798b80-kube-api-access-flssm" (OuterVolumeSpecName: "kube-api-access-flssm") pod "26711161-fa32-4166-b646-1958af798b80" (UID: "26711161-fa32-4166-b646-1958af798b80"). InnerVolumeSpecName "kube-api-access-flssm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.661894 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f15805-573f-4e5e-9897-9ba5f8f72d28-kube-api-access-5vgd5" (OuterVolumeSpecName: "kube-api-access-5vgd5") pod "d6f15805-573f-4e5e-9897-9ba5f8f72d28" (UID: "d6f15805-573f-4e5e-9897-9ba5f8f72d28"). InnerVolumeSpecName "kube-api-access-5vgd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.745169 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flssm\" (UniqueName: \"kubernetes.io/projected/26711161-fa32-4166-b646-1958af798b80-kube-api-access-flssm\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:25 crc kubenswrapper[4907]: I1009 19:47:25.745203 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vgd5\" (UniqueName: \"kubernetes.io/projected/d6f15805-573f-4e5e-9897-9ba5f8f72d28-kube-api-access-5vgd5\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.126975 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerStarted","Data":"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61"} Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.128973 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e625-account-create-44ktn" event={"ID":"26711161-fa32-4166-b646-1958af798b80","Type":"ContainerDied","Data":"1c3e9bc317800e2ca9532f19300307fd1aa2d23a3f789751d53ff89dbbce1de0"} Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.128994 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e625-account-create-44ktn" Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.128997 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c3e9bc317800e2ca9532f19300307fd1aa2d23a3f789751d53ff89dbbce1de0" Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.130780 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-550c-account-create-m5bhx" Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.130768 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-550c-account-create-m5bhx" event={"ID":"d6f15805-573f-4e5e-9897-9ba5f8f72d28","Type":"ContainerDied","Data":"9d5b5ef2afaddb1e0b417c35259f8f1fe18f16e140c31d15cdaf18038a154926"} Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.130934 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d5b5ef2afaddb1e0b417c35259f8f1fe18f16e140c31d15cdaf18038a154926" Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.403695 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fc5a-account-create-fhcs9" Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.456485 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtt5b\" (UniqueName: \"kubernetes.io/projected/d250ac44-116b-4ab0-9ba9-1396ea38a602-kube-api-access-dtt5b\") pod \"d250ac44-116b-4ab0-9ba9-1396ea38a602\" (UID: \"d250ac44-116b-4ab0-9ba9-1396ea38a602\") " Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.461419 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d250ac44-116b-4ab0-9ba9-1396ea38a602-kube-api-access-dtt5b" (OuterVolumeSpecName: "kube-api-access-dtt5b") pod "d250ac44-116b-4ab0-9ba9-1396ea38a602" (UID: "d250ac44-116b-4ab0-9ba9-1396ea38a602"). InnerVolumeSpecName "kube-api-access-dtt5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:26 crc kubenswrapper[4907]: I1009 19:47:26.566887 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtt5b\" (UniqueName: \"kubernetes.io/projected/d250ac44-116b-4ab0-9ba9-1396ea38a602-kube-api-access-dtt5b\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:27 crc kubenswrapper[4907]: I1009 19:47:27.150634 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fc5a-account-create-fhcs9" Oct 09 19:47:27 crc kubenswrapper[4907]: I1009 19:47:27.169323 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fc5a-account-create-fhcs9" event={"ID":"d250ac44-116b-4ab0-9ba9-1396ea38a602","Type":"ContainerDied","Data":"f5ba628e08159fd39e6948b2977c18af7b5d858da2bb7ac6cf8fe680f57439a8"} Oct 09 19:47:27 crc kubenswrapper[4907]: I1009 19:47:27.170007 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ba628e08159fd39e6948b2977c18af7b5d858da2bb7ac6cf8fe680f57439a8" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.145062 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5b5ql"] Oct 09 19:47:28 crc kubenswrapper[4907]: E1009 19:47:28.145508 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d250ac44-116b-4ab0-9ba9-1396ea38a602" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.145523 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d250ac44-116b-4ab0-9ba9-1396ea38a602" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: E1009 19:47:28.145552 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f15805-573f-4e5e-9897-9ba5f8f72d28" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.145559 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6f15805-573f-4e5e-9897-9ba5f8f72d28" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: E1009 19:47:28.145577 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26711161-fa32-4166-b646-1958af798b80" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.145585 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="26711161-fa32-4166-b646-1958af798b80" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.145804 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="26711161-fa32-4166-b646-1958af798b80" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.145833 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f15805-573f-4e5e-9897-9ba5f8f72d28" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.145851 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d250ac44-116b-4ab0-9ba9-1396ea38a602" containerName="mariadb-account-create" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.148389 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.154252 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.156143 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7cbd9" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.156394 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.162862 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerStarted","Data":"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1"} Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.163242 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.165255 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5b5ql"] Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.197771 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.197869 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-scripts\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: 
\"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.198048 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-config-data\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.198084 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frx58\" (UniqueName: \"kubernetes.io/projected/7241797e-6008-4959-8314-f8100841d03c-kube-api-access-frx58\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.199517 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.148824477 podStartE2EDuration="6.199498737s" podCreationTimestamp="2025-10-09 19:47:22 +0000 UTC" firstStartedPulling="2025-10-09 19:47:22.938924879 +0000 UTC m=+1128.470892368" lastFinishedPulling="2025-10-09 19:47:26.989599129 +0000 UTC m=+1132.521566628" observedRunningTime="2025-10-09 19:47:28.194273959 +0000 UTC m=+1133.726241458" watchObservedRunningTime="2025-10-09 19:47:28.199498737 +0000 UTC m=+1133.731466226" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.299778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-scripts\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.299932 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-config-data\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.299962 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frx58\" (UniqueName: \"kubernetes.io/projected/7241797e-6008-4959-8314-f8100841d03c-kube-api-access-frx58\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.300015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.307062 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-scripts\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.307300 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-config-data\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.317523 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.323540 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frx58\" (UniqueName: \"kubernetes.io/projected/7241797e-6008-4959-8314-f8100841d03c-kube-api-access-frx58\") pod \"nova-cell0-conductor-db-sync-5b5ql\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.471861 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:28 crc kubenswrapper[4907]: I1009 19:47:28.953228 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5b5ql"] Oct 09 19:47:29 crc kubenswrapper[4907]: I1009 19:47:29.171756 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" event={"ID":"7241797e-6008-4959-8314-f8100841d03c","Type":"ContainerStarted","Data":"141b4985a3dc0d41a25e24caea1b2ad361ca1a0c97cb48ca35840941c19c8bd7"} Oct 09 19:47:29 crc kubenswrapper[4907]: I1009 19:47:29.213197 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.181985 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-central-agent" containerID="cri-o://11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" gracePeriod=30 Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.182440 4907 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="proxy-httpd" containerID="cri-o://4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" gracePeriod=30 Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.182604 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="sg-core" containerID="cri-o://3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" gracePeriod=30 Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.182658 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-notification-agent" containerID="cri-o://9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" gracePeriod=30 Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.260110 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.260368 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-log" containerID="cri-o://86d032acf14b35b8e72c6120c149a990a98c8ede352d3378c1e4ab843847125d" gracePeriod=30 Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.260449 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-httpd" containerID="cri-o://898f4310884167c2cbef6678a0d6f3533d2d231c3a78961160db814ca10a3361" gracePeriod=30 Oct 09 19:47:30 crc kubenswrapper[4907]: I1009 19:47:30.965417 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.070003 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-scripts\") pod \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.070111 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-config-data\") pod \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.070140 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-run-httpd\") pod \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.070169 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-combined-ca-bundle\") pod \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.070240 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-sg-core-conf-yaml\") pod \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.070281 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-log-httpd\") pod \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.070312 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jc72\" (UniqueName: \"kubernetes.io/projected/1b4d1c6e-fe49-4334-8423-ab76a02c6620-kube-api-access-9jc72\") pod \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\" (UID: \"1b4d1c6e-fe49-4334-8423-ab76a02c6620\") " Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.071384 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b4d1c6e-fe49-4334-8423-ab76a02c6620" (UID: "1b4d1c6e-fe49-4334-8423-ab76a02c6620"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.071603 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b4d1c6e-fe49-4334-8423-ab76a02c6620" (UID: "1b4d1c6e-fe49-4334-8423-ab76a02c6620"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.076563 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-scripts" (OuterVolumeSpecName: "scripts") pod "1b4d1c6e-fe49-4334-8423-ab76a02c6620" (UID: "1b4d1c6e-fe49-4334-8423-ab76a02c6620"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.076721 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4d1c6e-fe49-4334-8423-ab76a02c6620-kube-api-access-9jc72" (OuterVolumeSpecName: "kube-api-access-9jc72") pod "1b4d1c6e-fe49-4334-8423-ab76a02c6620" (UID: "1b4d1c6e-fe49-4334-8423-ab76a02c6620"). InnerVolumeSpecName "kube-api-access-9jc72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.098189 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b4d1c6e-fe49-4334-8423-ab76a02c6620" (UID: "1b4d1c6e-fe49-4334-8423-ab76a02c6620"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.143892 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b4d1c6e-fe49-4334-8423-ab76a02c6620" (UID: "1b4d1c6e-fe49-4334-8423-ab76a02c6620"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.166071 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-config-data" (OuterVolumeSpecName: "config-data") pod "1b4d1c6e-fe49-4334-8423-ab76a02c6620" (UID: "1b4d1c6e-fe49-4334-8423-ab76a02c6620"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.172346 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.172384 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.172399 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.172484 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.172498 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b4d1c6e-fe49-4334-8423-ab76a02c6620-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.172509 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jc72\" (UniqueName: \"kubernetes.io/projected/1b4d1c6e-fe49-4334-8423-ab76a02c6620-kube-api-access-9jc72\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.172519 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d1c6e-fe49-4334-8423-ab76a02c6620-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.194037 4907 generic.go:334] "Generic 
(PLEG): container finished" podID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerID="86d032acf14b35b8e72c6120c149a990a98c8ede352d3378c1e4ab843847125d" exitCode=143 Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.194116 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06","Type":"ContainerDied","Data":"86d032acf14b35b8e72c6120c149a990a98c8ede352d3378c1e4ab843847125d"} Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198175 4907 generic.go:334] "Generic (PLEG): container finished" podID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerID="4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" exitCode=0 Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198205 4907 generic.go:334] "Generic (PLEG): container finished" podID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerID="3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" exitCode=2 Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198216 4907 generic.go:334] "Generic (PLEG): container finished" podID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerID="9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" exitCode=0 Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198230 4907 generic.go:334] "Generic (PLEG): container finished" podID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerID="11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" exitCode=0 Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198240 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198226 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerDied","Data":"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1"} Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198284 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerDied","Data":"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61"} Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198303 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerDied","Data":"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da"} Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198321 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerDied","Data":"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69"} Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198340 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b4d1c6e-fe49-4334-8423-ab76a02c6620","Type":"ContainerDied","Data":"1d7a3e3a7dd10eb88ab99a277494c3ac6549a49d496cac7ecbf7f4a6a3d973bc"} Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.198361 4907 scope.go:117] "RemoveContainer" containerID="4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.230227 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.231483 4907 scope.go:117] "RemoveContainer" 
containerID="3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.237388 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.252999 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.253500 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-central-agent" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.253522 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-central-agent" Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.253533 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="sg-core" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.253540 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="sg-core" Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.253559 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="proxy-httpd" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.253567 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="proxy-httpd" Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.253581 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-notification-agent" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.253590 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-notification-agent" Oct 09 19:47:31 
crc kubenswrapper[4907]: I1009 19:47:31.254754 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-notification-agent" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.254782 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="sg-core" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.254795 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="ceilometer-central-agent" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.254807 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" containerName="proxy-httpd" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.256530 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.258688 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.259416 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.262458 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.281319 4907 scope.go:117] "RemoveContainer" containerID="9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.306772 4907 scope.go:117] "RemoveContainer" containerID="11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.383969 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv49c\" 
(UniqueName: \"kubernetes.io/projected/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-kube-api-access-rv49c\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.384038 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-log-httpd\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.384072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.384087 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.384120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-config-data\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.384309 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.384533 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-scripts\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.385699 4907 scope.go:117] "RemoveContainer" containerID="4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.386355 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": container with ID starting with 4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1 not found: ID does not exist" containerID="4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.386409 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1"} err="failed to get container status \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": rpc error: code = NotFound desc = could not find container \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": container with ID starting with 4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.386442 4907 scope.go:117] "RemoveContainer" containerID="3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.387699 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": container with ID starting with 3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61 not found: ID does not exist" containerID="3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.387736 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61"} err="failed to get container status \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": rpc error: code = NotFound desc = could not find container \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": container with ID starting with 3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.387762 4907 scope.go:117] "RemoveContainer" containerID="9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.388103 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": container with ID starting with 9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da not found: ID does not exist" containerID="9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.388151 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da"} err="failed to get container status \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": rpc error: code = NotFound desc = could not find container 
\"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": container with ID starting with 9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.388177 4907 scope.go:117] "RemoveContainer" containerID="11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" Oct 09 19:47:31 crc kubenswrapper[4907]: E1009 19:47:31.388655 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": container with ID starting with 11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69 not found: ID does not exist" containerID="11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.388919 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69"} err="failed to get container status \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": rpc error: code = NotFound desc = could not find container \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": container with ID starting with 11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.388956 4907 scope.go:117] "RemoveContainer" containerID="4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.389403 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1"} err="failed to get container status \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": rpc error: code = NotFound desc = could not find 
container \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": container with ID starting with 4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.389441 4907 scope.go:117] "RemoveContainer" containerID="3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.389823 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61"} err="failed to get container status \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": rpc error: code = NotFound desc = could not find container \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": container with ID starting with 3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.389872 4907 scope.go:117] "RemoveContainer" containerID="9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.390305 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da"} err="failed to get container status \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": rpc error: code = NotFound desc = could not find container \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": container with ID starting with 9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.390354 4907 scope.go:117] "RemoveContainer" containerID="11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.390694 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69"} err="failed to get container status \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": rpc error: code = NotFound desc = could not find container \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": container with ID starting with 11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.390719 4907 scope.go:117] "RemoveContainer" containerID="4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.390962 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1"} err="failed to get container status \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": rpc error: code = NotFound desc = could not find container \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": container with ID starting with 4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.390988 4907 scope.go:117] "RemoveContainer" containerID="3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.391276 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61"} err="failed to get container status \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": rpc error: code = NotFound desc = could not find container \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": container with ID starting with 
3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.391329 4907 scope.go:117] "RemoveContainer" containerID="9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.391792 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da"} err="failed to get container status \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": rpc error: code = NotFound desc = could not find container \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": container with ID starting with 9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.391820 4907 scope.go:117] "RemoveContainer" containerID="11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.392048 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69"} err="failed to get container status \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": rpc error: code = NotFound desc = could not find container \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": container with ID starting with 11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.392075 4907 scope.go:117] "RemoveContainer" containerID="4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.392657 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1"} err="failed to get container status \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": rpc error: code = NotFound desc = could not find container \"4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1\": container with ID starting with 4d72225f77857262d6f159e76883e092e1b4795948932ae4ee2d829ca6aa48e1 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.392686 4907 scope.go:117] "RemoveContainer" containerID="3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.393006 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61"} err="failed to get container status \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": rpc error: code = NotFound desc = could not find container \"3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61\": container with ID starting with 3994f8bdded4acc9a8800bc09961ab91e451cb48c9b57e5a5ee241f110f42e61 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.393044 4907 scope.go:117] "RemoveContainer" containerID="9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.393422 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da"} err="failed to get container status \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": rpc error: code = NotFound desc = could not find container \"9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da\": container with ID starting with 9885de1a4eebdba6cc165edff205e142655cd4ed2c19dd0fd69c53c8913bf8da not found: ID does not 
exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.393451 4907 scope.go:117] "RemoveContainer" containerID="11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.393707 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69"} err="failed to get container status \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": rpc error: code = NotFound desc = could not find container \"11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69\": container with ID starting with 11410711ead21daa8e63f920680a8a5529484bee3e91f36d1dca07bbceba9b69 not found: ID does not exist" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.486715 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-run-httpd\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.486835 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-scripts\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.486877 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv49c\" (UniqueName: \"kubernetes.io/projected/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-kube-api-access-rv49c\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.487008 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-log-httpd\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.487045 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.487064 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.487106 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-config-data\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.488132 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-run-httpd\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.488747 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-log-httpd\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.491847 
4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-scripts\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.492381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.492774 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-config-data\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.493295 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.504950 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv49c\" (UniqueName: \"kubernetes.io/projected/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-kube-api-access-rv49c\") pod \"ceilometer-0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " pod="openstack/ceilometer-0" Oct 09 19:47:31 crc kubenswrapper[4907]: I1009 19:47:31.592074 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:32 crc kubenswrapper[4907]: I1009 19:47:32.044834 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:32 crc kubenswrapper[4907]: I1009 19:47:32.115215 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:47:32 crc kubenswrapper[4907]: I1009 19:47:32.115501 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-log" containerID="cri-o://c5bc7122d48b7c9b4bc650e0224ec9c293ab19affb07db61a86db0c52367167b" gracePeriod=30 Oct 09 19:47:32 crc kubenswrapper[4907]: I1009 19:47:32.115584 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-httpd" containerID="cri-o://6de742bdf70c0ce344dd3a058bb640a743aabfdefdf1c888528251a3b365dc53" gracePeriod=30 Oct 09 19:47:32 crc kubenswrapper[4907]: I1009 19:47:32.688493 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:33 crc kubenswrapper[4907]: I1009 19:47:33.162517 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b4d1c6e-fe49-4334-8423-ab76a02c6620" path="/var/lib/kubelet/pods/1b4d1c6e-fe49-4334-8423-ab76a02c6620/volumes" Oct 09 19:47:33 crc kubenswrapper[4907]: I1009 19:47:33.223246 4907 generic.go:334] "Generic (PLEG): container finished" podID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerID="c5bc7122d48b7c9b4bc650e0224ec9c293ab19affb07db61a86db0c52367167b" exitCode=143 Oct 09 19:47:33 crc kubenswrapper[4907]: I1009 19:47:33.223287 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a","Type":"ContainerDied","Data":"c5bc7122d48b7c9b4bc650e0224ec9c293ab19affb07db61a86db0c52367167b"} Oct 09 19:47:34 crc kubenswrapper[4907]: I1009 19:47:34.236149 4907 generic.go:334] "Generic (PLEG): container finished" podID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerID="898f4310884167c2cbef6678a0d6f3533d2d231c3a78961160db814ca10a3361" exitCode=0 Oct 09 19:47:34 crc kubenswrapper[4907]: I1009 19:47:34.236491 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06","Type":"ContainerDied","Data":"898f4310884167c2cbef6678a0d6f3533d2d231c3a78961160db814ca10a3361"} Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.287216 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerStarted","Data":"bc3afcd9efa8d763a0a0eb7fca76cae586f4c44a6298ba3cf3f62506ba862fe8"} Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.290067 4907 generic.go:334] "Generic (PLEG): container finished" podID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerID="6de742bdf70c0ce344dd3a058bb640a743aabfdefdf1c888528251a3b365dc53" exitCode=0 Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.290103 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a","Type":"ContainerDied","Data":"6de742bdf70c0ce344dd3a058bb640a743aabfdefdf1c888528251a3b365dc53"} Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.334871 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 
19:47:36.334926 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.334967 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.336006 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9461f8fa6da50e0e37d8d2c88aee594214386d7c074bf0b7db5d5d79f7d078a8"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.336097 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://9461f8fa6da50e0e37d8d2c88aee594214386d7c074bf0b7db5d5d79f7d078a8" gracePeriod=600 Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.779999 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.788236 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909331 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhpzt\" (UniqueName: \"kubernetes.io/projected/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-kube-api-access-fhpzt\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909751 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5drr2\" (UniqueName: \"kubernetes.io/projected/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-kube-api-access-5drr2\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909796 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-public-tls-certs\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909840 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-logs\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909865 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909887 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909941 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-httpd-run\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.909977 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-combined-ca-bundle\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910002 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-combined-ca-bundle\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910058 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-config-data\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910090 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-internal-tls-certs\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910122 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-httpd-run\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910143 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-config-data\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910161 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-scripts\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-scripts\") pod \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\" (UID: \"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910259 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-logs\") pod \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\" (UID: \"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06\") " Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.910887 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-logs" (OuterVolumeSpecName: "logs") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.911272 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-logs" (OuterVolumeSpecName: "logs") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.911897 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.915677 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-kube-api-access-fhpzt" (OuterVolumeSpecName: "kube-api-access-fhpzt") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "kube-api-access-fhpzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.917282 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.917833 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-scripts" (OuterVolumeSpecName: "scripts") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.918047 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-kube-api-access-5drr2" (OuterVolumeSpecName: "kube-api-access-5drr2") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "kube-api-access-5drr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.922372 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.925705 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 19:47:36 crc kubenswrapper[4907]: I1009 19:47:36.944279 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-scripts" (OuterVolumeSpecName: "scripts") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012807 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012844 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012854 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhpzt\" (UniqueName: \"kubernetes.io/projected/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-kube-api-access-fhpzt\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012864 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5drr2\" (UniqueName: \"kubernetes.io/projected/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-kube-api-access-5drr2\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012873 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012901 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012914 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012922 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012931 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.012939 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.062772 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.079358 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.080350 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.097136 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.107079 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.110652 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-config-data" (OuterVolumeSpecName: "config-data") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.111030 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" (UID: "b076f1dc-3d4e-4be1-96d6-6e0a8229ff06"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.114886 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.114917 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.114927 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.114936 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.114945 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.114953 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.114963 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.126979 4907 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-config-data" (OuterVolumeSpecName: "config-data") pod "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" (UID: "7ef1b7f2-599e-4002-bbe9-a75cbfa1091a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.216437 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.301001 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b076f1dc-3d4e-4be1-96d6-6e0a8229ff06","Type":"ContainerDied","Data":"d16348952eb3a0bcde945ff3096e331541f6e438e75144727424386aaf54be59"} Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.301052 4907 scope.go:117] "RemoveContainer" containerID="898f4310884167c2cbef6678a0d6f3533d2d231c3a78961160db814ca10a3361" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.301095 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.304384 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" event={"ID":"7241797e-6008-4959-8314-f8100841d03c","Type":"ContainerStarted","Data":"6c205bb8dd05ab038799b5a521785fff3a16b781e2a2f9d8aa7e868cba6b1a70"} Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.319135 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="9461f8fa6da50e0e37d8d2c88aee594214386d7c074bf0b7db5d5d79f7d078a8" exitCode=0 Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.319175 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"9461f8fa6da50e0e37d8d2c88aee594214386d7c074bf0b7db5d5d79f7d078a8"} Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.319254 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"84d17c3295a6b716a77384873892c694131b36c566b6c05af89285bf8e725573"} Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.332784 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" podStartSLOduration=1.449700253 podStartE2EDuration="9.332761756s" podCreationTimestamp="2025-10-09 19:47:28 +0000 UTC" firstStartedPulling="2025-10-09 19:47:28.966483426 +0000 UTC m=+1134.498450925" lastFinishedPulling="2025-10-09 19:47:36.849544929 +0000 UTC m=+1142.381512428" observedRunningTime="2025-10-09 19:47:37.321544931 +0000 UTC m=+1142.853512450" watchObservedRunningTime="2025-10-09 19:47:37.332761756 +0000 UTC m=+1142.864729245" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 
19:47:37.337646 4907 scope.go:117] "RemoveContainer" containerID="86d032acf14b35b8e72c6120c149a990a98c8ede352d3378c1e4ab843847125d" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.340797 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef1b7f2-599e-4002-bbe9-a75cbfa1091a","Type":"ContainerDied","Data":"1ae70108a38d472cef5b39455a8cb75ce70b0706b8cd86e79d03c3ee6fc027cd"} Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.340892 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.360536 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.372148 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.384763 4907 scope.go:117] "RemoveContainer" containerID="e52c7a0fe32a558feb0415aa3280260b781b94dc2a29de298da06be1d8aa2d54" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.400815 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: E1009 19:47:37.401212 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-log" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401228 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-log" Oct 09 19:47:37 crc kubenswrapper[4907]: E1009 19:47:37.401247 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-httpd" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401253 4907 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-httpd" Oct 09 19:47:37 crc kubenswrapper[4907]: E1009 19:47:37.401275 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-log" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401282 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-log" Oct 09 19:47:37 crc kubenswrapper[4907]: E1009 19:47:37.401302 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-httpd" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401307 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-httpd" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401497 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-log" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401508 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-httpd" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401523 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" containerName="glance-log" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.401536 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" containerName="glance-httpd" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.402440 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.412948 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.413243 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sjjmv" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.413352 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.413455 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.414436 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.426041 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.427266 4907 scope.go:117] "RemoveContainer" containerID="6de742bdf70c0ce344dd3a058bb640a743aabfdefdf1c888528251a3b365dc53" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.440393 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.445429 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.447099 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.451279 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.451363 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.455541 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.462667 4907 scope.go:117] "RemoveContainer" containerID="c5bc7122d48b7c9b4bc650e0224ec9c293ab19affb07db61a86db0c52367167b" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.525776 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.525836 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.525878 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-logs\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc 
kubenswrapper[4907]: I1009 19:47:37.525906 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp9mr\" (UniqueName: \"kubernetes.io/projected/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-kube-api-access-gp9mr\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.525932 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.525951 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526362 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526447 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526513 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526587 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526620 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-logs\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526643 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526684 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z758c\" (UniqueName: \"kubernetes.io/projected/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-kube-api-access-z758c\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526706 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.526742 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628585 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-logs\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp9mr\" (UniqueName: \"kubernetes.io/projected/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-kube-api-access-gp9mr\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " 
pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628707 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628813 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628844 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc 
kubenswrapper[4907]: I1009 19:47:37.628873 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628905 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628946 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-logs\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.628969 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.629018 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z758c\" (UniqueName: \"kubernetes.io/projected/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-kube-api-access-z758c\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.629047 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.629092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.629119 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.629140 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.633232 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-logs\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.633925 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.634112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.634441 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-logs\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.635334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.636240 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.636621 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.636884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.638987 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.640181 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.650253 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.652279 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " 
pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.655044 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.655858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.657387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z758c\" (UniqueName: \"kubernetes.io/projected/6bc9d4bf-354f-45c7-8116-6119f3f78b0c-kube-api-access-z758c\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.672988 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"6bc9d4bf-354f-45c7-8116-6119f3f78b0c\") " pod="openstack/glance-default-internal-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.677235 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp9mr\" (UniqueName: \"kubernetes.io/projected/ce1c21df-d6fe-46f5-b959-8c720f7b4fcb-kube-api-access-gp9mr\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: 
I1009 19:47:37.681311 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb\") " pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.728514 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 19:47:37 crc kubenswrapper[4907]: I1009 19:47:37.769965 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:38 crc kubenswrapper[4907]: I1009 19:47:38.246950 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 19:47:38 crc kubenswrapper[4907]: I1009 19:47:38.354366 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb","Type":"ContainerStarted","Data":"0a5da7e68db6cbbb7bc0c4294f274bbea49cc279c02bcb46dcc8daa4aafaf910"} Oct 09 19:47:38 crc kubenswrapper[4907]: I1009 19:47:38.356645 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerStarted","Data":"1fa565f67445aa35b89917021f2884eae223eea173ad929fa08ae0ce38c900f0"} Oct 09 19:47:38 crc kubenswrapper[4907]: I1009 19:47:38.356693 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerStarted","Data":"ee4070fc44b9db9e70783c8780089ca3de1305a5fde90720c5dcff3987689610"} Oct 09 19:47:38 crc kubenswrapper[4907]: I1009 19:47:38.422304 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 19:47:38 crc kubenswrapper[4907]: W1009 19:47:38.425153 4907 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bc9d4bf_354f_45c7_8116_6119f3f78b0c.slice/crio-2272e83527e8f0289859c85c2644f6ff4eb944c176b697e6180bd59fadd912eb WatchSource:0}: Error finding container 2272e83527e8f0289859c85c2644f6ff4eb944c176b697e6180bd59fadd912eb: Status 404 returned error can't find the container with id 2272e83527e8f0289859c85c2644f6ff4eb944c176b697e6180bd59fadd912eb Oct 09 19:47:39 crc kubenswrapper[4907]: I1009 19:47:39.184566 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef1b7f2-599e-4002-bbe9-a75cbfa1091a" path="/var/lib/kubelet/pods/7ef1b7f2-599e-4002-bbe9-a75cbfa1091a/volumes" Oct 09 19:47:39 crc kubenswrapper[4907]: I1009 19:47:39.186288 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b076f1dc-3d4e-4be1-96d6-6e0a8229ff06" path="/var/lib/kubelet/pods/b076f1dc-3d4e-4be1-96d6-6e0a8229ff06/volumes" Oct 09 19:47:39 crc kubenswrapper[4907]: I1009 19:47:39.378679 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6bc9d4bf-354f-45c7-8116-6119f3f78b0c","Type":"ContainerStarted","Data":"147787778f353086e63ccc37f0e05c55e0bdfd810c58390f9d2bc0316f25c029"} Oct 09 19:47:39 crc kubenswrapper[4907]: I1009 19:47:39.378719 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6bc9d4bf-354f-45c7-8116-6119f3f78b0c","Type":"ContainerStarted","Data":"2272e83527e8f0289859c85c2644f6ff4eb944c176b697e6180bd59fadd912eb"} Oct 09 19:47:39 crc kubenswrapper[4907]: I1009 19:47:39.381443 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb","Type":"ContainerStarted","Data":"edb99167bf0fca35409871abd6397887f16d1782e3fe79ed354c0e5f338de484"} Oct 09 19:47:39 crc kubenswrapper[4907]: I1009 19:47:39.385161 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerStarted","Data":"2d919aced3330f0109b811a63dc252c84cd1e7aa0960809feb497b4c6d4fd762"} Oct 09 19:47:40 crc kubenswrapper[4907]: I1009 19:47:40.405262 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce1c21df-d6fe-46f5-b959-8c720f7b4fcb","Type":"ContainerStarted","Data":"c855365a5e58bd8ccdb8bea8c78dca8bc0c671a89925a1ba3f79effe6496ec4a"} Oct 09 19:47:40 crc kubenswrapper[4907]: I1009 19:47:40.414548 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6bc9d4bf-354f-45c7-8116-6119f3f78b0c","Type":"ContainerStarted","Data":"7cf1813ac6d07758f9cfe1df125d0802df3e0d33b3875004e5d3381d8a85f1bc"} Oct 09 19:47:40 crc kubenswrapper[4907]: I1009 19:47:40.450710 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.45069441 podStartE2EDuration="3.45069441s" podCreationTimestamp="2025-10-09 19:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:47:40.43031862 +0000 UTC m=+1145.962286119" watchObservedRunningTime="2025-10-09 19:47:40.45069441 +0000 UTC m=+1145.982661899" Oct 09 19:47:40 crc kubenswrapper[4907]: I1009 19:47:40.452038 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.452029343 podStartE2EDuration="3.452029343s" podCreationTimestamp="2025-10-09 19:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:47:40.447644735 +0000 UTC m=+1145.979612234" watchObservedRunningTime="2025-10-09 19:47:40.452029343 +0000 UTC 
m=+1145.983996832" Oct 09 19:47:41 crc kubenswrapper[4907]: I1009 19:47:41.425279 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-central-agent" containerID="cri-o://1fa565f67445aa35b89917021f2884eae223eea173ad929fa08ae0ce38c900f0" gracePeriod=30 Oct 09 19:47:41 crc kubenswrapper[4907]: I1009 19:47:41.425888 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerStarted","Data":"40a8be17144f2c7592a3005a980bf46f6f991701cf0f9d83635dd8f84223b711"} Oct 09 19:47:41 crc kubenswrapper[4907]: I1009 19:47:41.426536 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 19:47:41 crc kubenswrapper[4907]: I1009 19:47:41.426867 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="proxy-httpd" containerID="cri-o://40a8be17144f2c7592a3005a980bf46f6f991701cf0f9d83635dd8f84223b711" gracePeriod=30 Oct 09 19:47:41 crc kubenswrapper[4907]: I1009 19:47:41.426949 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="sg-core" containerID="cri-o://2d919aced3330f0109b811a63dc252c84cd1e7aa0960809feb497b4c6d4fd762" gracePeriod=30 Oct 09 19:47:41 crc kubenswrapper[4907]: I1009 19:47:41.426998 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-notification-agent" containerID="cri-o://ee4070fc44b9db9e70783c8780089ca3de1305a5fde90720c5dcff3987689610" gracePeriod=30 Oct 09 19:47:41 crc kubenswrapper[4907]: I1009 19:47:41.463380 4907 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.190852505 podStartE2EDuration="10.463361388s" podCreationTimestamp="2025-10-09 19:47:31 +0000 UTC" firstStartedPulling="2025-10-09 19:47:36.186733807 +0000 UTC m=+1141.718701296" lastFinishedPulling="2025-10-09 19:47:40.45924269 +0000 UTC m=+1145.991210179" observedRunningTime="2025-10-09 19:47:41.457069143 +0000 UTC m=+1146.989036682" watchObservedRunningTime="2025-10-09 19:47:41.463361388 +0000 UTC m=+1146.995328887" Oct 09 19:47:42 crc kubenswrapper[4907]: I1009 19:47:42.434838 4907 generic.go:334] "Generic (PLEG): container finished" podID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerID="40a8be17144f2c7592a3005a980bf46f6f991701cf0f9d83635dd8f84223b711" exitCode=0 Oct 09 19:47:42 crc kubenswrapper[4907]: I1009 19:47:42.435123 4907 generic.go:334] "Generic (PLEG): container finished" podID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerID="2d919aced3330f0109b811a63dc252c84cd1e7aa0960809feb497b4c6d4fd762" exitCode=2 Oct 09 19:47:42 crc kubenswrapper[4907]: I1009 19:47:42.435133 4907 generic.go:334] "Generic (PLEG): container finished" podID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerID="ee4070fc44b9db9e70783c8780089ca3de1305a5fde90720c5dcff3987689610" exitCode=0 Oct 09 19:47:42 crc kubenswrapper[4907]: I1009 19:47:42.435007 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerDied","Data":"40a8be17144f2c7592a3005a980bf46f6f991701cf0f9d83635dd8f84223b711"} Oct 09 19:47:42 crc kubenswrapper[4907]: I1009 19:47:42.435165 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerDied","Data":"2d919aced3330f0109b811a63dc252c84cd1e7aa0960809feb497b4c6d4fd762"} Oct 09 19:47:42 crc kubenswrapper[4907]: I1009 19:47:42.435178 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerDied","Data":"ee4070fc44b9db9e70783c8780089ca3de1305a5fde90720c5dcff3987689610"} Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.479890 4907 generic.go:334] "Generic (PLEG): container finished" podID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerID="1fa565f67445aa35b89917021f2884eae223eea173ad929fa08ae0ce38c900f0" exitCode=0 Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.479958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerDied","Data":"1fa565f67445aa35b89917021f2884eae223eea173ad929fa08ae0ce38c900f0"} Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.761564 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.887335 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-log-httpd\") pod \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.887408 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-scripts\") pod \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.887561 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-sg-core-conf-yaml\") pod \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.887608 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-config-data\") pod \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.887653 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-run-httpd\") pod \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.887721 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-combined-ca-bundle\") pod \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.887768 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv49c\" (UniqueName: \"kubernetes.io/projected/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-kube-api-access-rv49c\") pod \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\" (UID: \"b919ae31-d4a1-484d-a62c-a95df4ce8eb0\") " Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.888267 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b919ae31-d4a1-484d-a62c-a95df4ce8eb0" (UID: "b919ae31-d4a1-484d-a62c-a95df4ce8eb0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.888694 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.889216 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b919ae31-d4a1-484d-a62c-a95df4ce8eb0" (UID: "b919ae31-d4a1-484d-a62c-a95df4ce8eb0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.894099 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-scripts" (OuterVolumeSpecName: "scripts") pod "b919ae31-d4a1-484d-a62c-a95df4ce8eb0" (UID: "b919ae31-d4a1-484d-a62c-a95df4ce8eb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.894290 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-kube-api-access-rv49c" (OuterVolumeSpecName: "kube-api-access-rv49c") pod "b919ae31-d4a1-484d-a62c-a95df4ce8eb0" (UID: "b919ae31-d4a1-484d-a62c-a95df4ce8eb0"). InnerVolumeSpecName "kube-api-access-rv49c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.914845 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b919ae31-d4a1-484d-a62c-a95df4ce8eb0" (UID: "b919ae31-d4a1-484d-a62c-a95df4ce8eb0"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.969575 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b919ae31-d4a1-484d-a62c-a95df4ce8eb0" (UID: "b919ae31-d4a1-484d-a62c-a95df4ce8eb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.989911 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.989947 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.989956 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.989968 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.989978 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv49c\" (UniqueName: \"kubernetes.io/projected/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-kube-api-access-rv49c\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:45 crc kubenswrapper[4907]: I1009 19:47:45.992100 4907 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-config-data" (OuterVolumeSpecName: "config-data") pod "b919ae31-d4a1-484d-a62c-a95df4ce8eb0" (UID: "b919ae31-d4a1-484d-a62c-a95df4ce8eb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.091612 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b919ae31-d4a1-484d-a62c-a95df4ce8eb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.495908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b919ae31-d4a1-484d-a62c-a95df4ce8eb0","Type":"ContainerDied","Data":"bc3afcd9efa8d763a0a0eb7fca76cae586f4c44a6298ba3cf3f62506ba862fe8"} Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.496265 4907 scope.go:117] "RemoveContainer" containerID="40a8be17144f2c7592a3005a980bf46f6f991701cf0f9d83635dd8f84223b711" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.495997 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.532767 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.539939 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.542735 4907 scope.go:117] "RemoveContainer" containerID="2d919aced3330f0109b811a63dc252c84cd1e7aa0960809feb497b4c6d4fd762" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.567831 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:46 crc kubenswrapper[4907]: E1009 19:47:46.568428 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="sg-core" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.568545 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="sg-core" Oct 09 19:47:46 crc kubenswrapper[4907]: E1009 19:47:46.568564 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-central-agent" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.568574 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-central-agent" Oct 09 19:47:46 crc kubenswrapper[4907]: E1009 19:47:46.568617 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="proxy-httpd" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.568629 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="proxy-httpd" Oct 09 19:47:46 crc kubenswrapper[4907]: E1009 19:47:46.568647 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-notification-agent" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.568658 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-notification-agent" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.568950 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="proxy-httpd" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.568975 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-notification-agent" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.569015 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="sg-core" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.569042 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" containerName="ceilometer-central-agent" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.571793 4907 scope.go:117] "RemoveContainer" containerID="ee4070fc44b9db9e70783c8780089ca3de1305a5fde90720c5dcff3987689610" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.572416 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.575726 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.575863 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.578194 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.611648 4907 scope.go:117] "RemoveContainer" containerID="1fa565f67445aa35b89917021f2884eae223eea173ad929fa08ae0ce38c900f0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.704436 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-scripts\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.704507 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.704526 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-run-httpd\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.704569 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-config-data\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.704833 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-log-httpd\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.704941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fnsv\" (UniqueName: \"kubernetes.io/projected/a805e428-d0d8-423c-b0c7-f8d7bbd31408-kube-api-access-5fnsv\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.705273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.810391 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-log-httpd\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.810524 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fnsv\" (UniqueName: \"kubernetes.io/projected/a805e428-d0d8-423c-b0c7-f8d7bbd31408-kube-api-access-5fnsv\") pod \"ceilometer-0\" (UID: 
\"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.810718 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.810900 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-scripts\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.810947 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-log-httpd\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.813347 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-run-httpd\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.813410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.813582 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-config-data\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.813778 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-run-httpd\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.818608 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.818817 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.819752 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-scripts\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.826593 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-config-data\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.839604 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5fnsv\" (UniqueName: \"kubernetes.io/projected/a805e428-d0d8-423c-b0c7-f8d7bbd31408-kube-api-access-5fnsv\") pod \"ceilometer-0\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " pod="openstack/ceilometer-0" Oct 09 19:47:46 crc kubenswrapper[4907]: I1009 19:47:46.894850 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.162396 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b919ae31-d4a1-484d-a62c-a95df4ce8eb0" path="/var/lib/kubelet/pods/b919ae31-d4a1-484d-a62c-a95df4ce8eb0/volumes" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.366528 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.507515 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerStarted","Data":"69bf4531f3d22e07ddf0786ca22a324f91e3838628c17a1b1a73644f178c4671"} Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.728809 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.728866 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.757135 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.770144 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.773241 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.774349 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.818449 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:47 crc kubenswrapper[4907]: I1009 19:47:47.818997 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:48 crc kubenswrapper[4907]: I1009 19:47:48.520553 4907 generic.go:334] "Generic (PLEG): container finished" podID="7241797e-6008-4959-8314-f8100841d03c" containerID="6c205bb8dd05ab038799b5a521785fff3a16b781e2a2f9d8aa7e868cba6b1a70" exitCode=0 Oct 09 19:47:48 crc kubenswrapper[4907]: I1009 19:47:48.520886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" event={"ID":"7241797e-6008-4959-8314-f8100841d03c","Type":"ContainerDied","Data":"6c205bb8dd05ab038799b5a521785fff3a16b781e2a2f9d8aa7e868cba6b1a70"} Oct 09 19:47:48 crc kubenswrapper[4907]: I1009 19:47:48.524041 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerStarted","Data":"564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa"} Oct 09 19:47:48 crc kubenswrapper[4907]: I1009 19:47:48.524694 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 19:47:48 crc kubenswrapper[4907]: I1009 19:47:48.524959 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:48 crc kubenswrapper[4907]: I1009 19:47:48.524994 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Oct 09 19:47:48 crc kubenswrapper[4907]: I1009 19:47:48.525006 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 19:47:49 crc kubenswrapper[4907]: I1009 19:47:49.535593 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerStarted","Data":"2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27"} Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.041604 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.182856 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-combined-ca-bundle\") pod \"7241797e-6008-4959-8314-f8100841d03c\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.182994 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-scripts\") pod \"7241797e-6008-4959-8314-f8100841d03c\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.183016 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frx58\" (UniqueName: \"kubernetes.io/projected/7241797e-6008-4959-8314-f8100841d03c-kube-api-access-frx58\") pod \"7241797e-6008-4959-8314-f8100841d03c\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.183089 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-config-data\") pod \"7241797e-6008-4959-8314-f8100841d03c\" (UID: \"7241797e-6008-4959-8314-f8100841d03c\") " Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.187691 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-scripts" (OuterVolumeSpecName: "scripts") pod "7241797e-6008-4959-8314-f8100841d03c" (UID: "7241797e-6008-4959-8314-f8100841d03c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.188844 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7241797e-6008-4959-8314-f8100841d03c-kube-api-access-frx58" (OuterVolumeSpecName: "kube-api-access-frx58") pod "7241797e-6008-4959-8314-f8100841d03c" (UID: "7241797e-6008-4959-8314-f8100841d03c"). InnerVolumeSpecName "kube-api-access-frx58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.232580 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-config-data" (OuterVolumeSpecName: "config-data") pod "7241797e-6008-4959-8314-f8100841d03c" (UID: "7241797e-6008-4959-8314-f8100841d03c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.262766 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7241797e-6008-4959-8314-f8100841d03c" (UID: "7241797e-6008-4959-8314-f8100841d03c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.284720 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.284760 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.284770 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frx58\" (UniqueName: \"kubernetes.io/projected/7241797e-6008-4959-8314-f8100841d03c-kube-api-access-frx58\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.284780 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7241797e-6008-4959-8314-f8100841d03c-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.533824 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.546173 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerStarted","Data":"515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558"} Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.547600 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.548412 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.549629 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5b5ql" event={"ID":"7241797e-6008-4959-8314-f8100841d03c","Type":"ContainerDied","Data":"141b4985a3dc0d41a25e24caea1b2ad361ca1a0c97cb48ca35840941c19c8bd7"} Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.549677 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="141b4985a3dc0d41a25e24caea1b2ad361ca1a0c97cb48ca35840941c19c8bd7" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.657246 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.665350 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 19:47:50 crc kubenswrapper[4907]: E1009 19:47:50.665798 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7241797e-6008-4959-8314-f8100841d03c" containerName="nova-cell0-conductor-db-sync" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.665817 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7241797e-6008-4959-8314-f8100841d03c" containerName="nova-cell0-conductor-db-sync" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.666046 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7241797e-6008-4959-8314-f8100841d03c" containerName="nova-cell0-conductor-db-sync" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.667621 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.674562 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.674635 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7cbd9" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.683242 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.683699 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.683797 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.745693 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.794255 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2qs\" (UniqueName: \"kubernetes.io/projected/92173588-8d80-440e-9dd4-62b132d5abed-kube-api-access-mz2qs\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.794358 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92173588-8d80-440e-9dd4-62b132d5abed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.794453 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92173588-8d80-440e-9dd4-62b132d5abed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.896094 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92173588-8d80-440e-9dd4-62b132d5abed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.896210 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92173588-8d80-440e-9dd4-62b132d5abed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.896315 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2qs\" (UniqueName: \"kubernetes.io/projected/92173588-8d80-440e-9dd4-62b132d5abed-kube-api-access-mz2qs\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.908595 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92173588-8d80-440e-9dd4-62b132d5abed-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.916105 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/92173588-8d80-440e-9dd4-62b132d5abed-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:50 crc kubenswrapper[4907]: I1009 19:47:50.936019 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2qs\" (UniqueName: \"kubernetes.io/projected/92173588-8d80-440e-9dd4-62b132d5abed-kube-api-access-mz2qs\") pod \"nova-cell0-conductor-0\" (UID: \"92173588-8d80-440e-9dd4-62b132d5abed\") " pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:51 crc kubenswrapper[4907]: I1009 19:47:50.997802 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:51 crc kubenswrapper[4907]: W1009 19:47:51.528223 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92173588_8d80_440e_9dd4_62b132d5abed.slice/crio-62cce2d44467851b5274e71f6bfe4f60c8de533a438e225d927c8a1b61e22b54 WatchSource:0}: Error finding container 62cce2d44467851b5274e71f6bfe4f60c8de533a438e225d927c8a1b61e22b54: Status 404 returned error can't find the container with id 62cce2d44467851b5274e71f6bfe4f60c8de533a438e225d927c8a1b61e22b54 Oct 09 19:47:51 crc kubenswrapper[4907]: I1009 19:47:51.528698 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 19:47:51 crc kubenswrapper[4907]: I1009 19:47:51.557619 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"92173588-8d80-440e-9dd4-62b132d5abed","Type":"ContainerStarted","Data":"62cce2d44467851b5274e71f6bfe4f60c8de533a438e225d927c8a1b61e22b54"} Oct 09 19:47:52 crc kubenswrapper[4907]: I1009 19:47:52.568390 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"92173588-8d80-440e-9dd4-62b132d5abed","Type":"ContainerStarted","Data":"4fe871aebb6c5c15af46e17b0b62950bc5e97f9ba9e35f445f9951163b0381df"} Oct 09 19:47:52 crc kubenswrapper[4907]: I1009 19:47:52.568857 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 09 19:47:52 crc kubenswrapper[4907]: I1009 19:47:52.571435 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerStarted","Data":"eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075"} Oct 09 19:47:52 crc kubenswrapper[4907]: I1009 19:47:52.571944 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 19:47:52 crc kubenswrapper[4907]: I1009 19:47:52.594719 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5946955149999997 podStartE2EDuration="2.594695515s" podCreationTimestamp="2025-10-09 19:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:47:52.589867806 +0000 UTC m=+1158.121835305" watchObservedRunningTime="2025-10-09 19:47:52.594695515 +0000 UTC m=+1158.126663004" Oct 09 19:47:52 crc kubenswrapper[4907]: I1009 19:47:52.618616 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8907974159999998 podStartE2EDuration="6.618577381s" podCreationTimestamp="2025-10-09 19:47:46 +0000 UTC" firstStartedPulling="2025-10-09 19:47:47.371881413 +0000 UTC m=+1152.903848902" lastFinishedPulling="2025-10-09 19:47:52.099661388 +0000 UTC m=+1157.631628867" observedRunningTime="2025-10-09 19:47:52.609727684 +0000 UTC m=+1158.141695183" watchObservedRunningTime="2025-10-09 19:47:52.618577381 +0000 UTC m=+1158.150544860" Oct 09 19:48:01 crc 
kubenswrapper[4907]: I1009 19:48:01.026106 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.468227 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-psdhs"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.469632 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.473874 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.474118 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.482190 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-psdhs"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.493225 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-scripts\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.493279 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.493309 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-config-data\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.493582 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnnkz\" (UniqueName: \"kubernetes.io/projected/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-kube-api-access-nnnkz\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.598742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnnkz\" (UniqueName: \"kubernetes.io/projected/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-kube-api-access-nnnkz\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.598847 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-scripts\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.598877 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.598901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-config-data\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.606046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-config-data\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.606917 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-scripts\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.609203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.621643 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnnkz\" (UniqueName: \"kubernetes.io/projected/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-kube-api-access-nnnkz\") pod \"nova-cell0-cell-mapping-psdhs\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") " pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.687753 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.689568 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.694856 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.701543 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.703638 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.707252 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.710375 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.729242 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.793028 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-psdhs" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806396 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325af37-9aae-4a1d-bfc1-b07ccca70588-logs\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806445 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-config-data\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806524 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp6sw\" (UniqueName: \"kubernetes.io/projected/a36484d9-eee6-44ce-9c42-3591dd75d1fa-kube-api-access-bp6sw\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806577 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36484d9-eee6-44ce-9c42-3591dd75d1fa-logs\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806644 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806668 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-config-data\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806689 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78pv8\" (UniqueName: \"kubernetes.io/projected/5325af37-9aae-4a1d-bfc1-b07ccca70588-kube-api-access-78pv8\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.806712 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.834587 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.854816 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.854944 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.862498 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.863816 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.863967 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.897126 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935646 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935688 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-config-data\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935716 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zd7z\" (UniqueName: \"kubernetes.io/projected/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-kube-api-access-7zd7z\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-78pv8\" (UniqueName: \"kubernetes.io/projected/5325af37-9aae-4a1d-bfc1-b07ccca70588-kube-api-access-78pv8\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935776 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935799 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935822 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-config-data\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325af37-9aae-4a1d-bfc1-b07ccca70588-logs\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935874 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-config-data\") pod \"nova-api-0\" (UID: 
\"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp6sw\" (UniqueName: \"kubernetes.io/projected/a36484d9-eee6-44ce-9c42-3591dd75d1fa-kube-api-access-bp6sw\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935952 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.935986 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfrb5\" (UniqueName: \"kubernetes.io/projected/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-kube-api-access-wfrb5\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.936010 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36484d9-eee6-44ce-9c42-3591dd75d1fa-logs\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 
crc kubenswrapper[4907]: I1009 19:48:01.936420 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36484d9-eee6-44ce-9c42-3591dd75d1fa-logs\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.952119 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.954070 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325af37-9aae-4a1d-bfc1-b07ccca70588-logs\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:01 crc kubenswrapper[4907]: I1009 19:48:01.998987 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-tkk4m"] Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.003970 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.028966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.031431 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-config-data\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.037234 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhcl\" (UniqueName: \"kubernetes.io/projected/ea4853ad-8428-43e1-839b-788a9e672eec-kube-api-access-mjhcl\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.037276 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.037309 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-config\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc 
kubenswrapper[4907]: I1009 19:48:02.037329 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.037358 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.037393 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zd7z\" (UniqueName: \"kubernetes.io/projected/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-kube-api-access-7zd7z\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.037432 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.044510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-config-data\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.045295 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.045343 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.049147 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfrb5\" (UniqueName: \"kubernetes.io/projected/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-kube-api-access-wfrb5\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.049229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.070230 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-tkk4m"] Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.071268 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78pv8\" (UniqueName: \"kubernetes.io/projected/5325af37-9aae-4a1d-bfc1-b07ccca70588-kube-api-access-78pv8\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 
09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.075691 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.075792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-config-data\") pod \"nova-api-0\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " pod="openstack/nova-api-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.080270 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.082312 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.082759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-config-data\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.083673 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.116568 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfrb5\" (UniqueName: \"kubernetes.io/projected/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-kube-api-access-wfrb5\") pod \"nova-scheduler-0\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.127201 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp6sw\" (UniqueName: \"kubernetes.io/projected/a36484d9-eee6-44ce-9c42-3591dd75d1fa-kube-api-access-bp6sw\") pod \"nova-metadata-0\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " pod="openstack/nova-metadata-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.144235 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zd7z\" (UniqueName: \"kubernetes.io/projected/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-kube-api-access-7zd7z\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.152575 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.152746 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-nb\") pod 
\"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.152769 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhcl\" (UniqueName: \"kubernetes.io/projected/ea4853ad-8428-43e1-839b-788a9e672eec-kube-api-access-mjhcl\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.152803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.153029 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-config\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.153045 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.153847 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: 
\"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.154328 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.157044 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.157377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-config\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.162167 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.188543 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhcl\" (UniqueName: \"kubernetes.io/projected/ea4853ad-8428-43e1-839b-788a9e672eec-kube-api-access-mjhcl\") pod \"dnsmasq-dns-845d6d6f59-tkk4m\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") " pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 
19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.333860 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.347157 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.358195 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.385542 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.426089 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.647608 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-psdhs"] Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.701022 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-psdhs" event={"ID":"3525ecd6-fd9f-47bd-b83d-7eb303d3032c","Type":"ContainerStarted","Data":"9972355f650205d9c0b87dcc1e0f52effbebb6cbe730aa0b8a5df247b6e9525c"} Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.825205 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6v72h"] Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.830967 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.835665 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.835919 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.854489 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6v72h"] Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.883558 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-config-data\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.884237 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2dt\" (UniqueName: \"kubernetes.io/projected/4360d8b6-f761-4e20-acd0-3cb6580dd756-kube-api-access-xr2dt\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.884355 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-scripts\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.884588 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.988304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.988355 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-config-data\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.988438 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2dt\" (UniqueName: \"kubernetes.io/projected/4360d8b6-f761-4e20-acd0-3cb6580dd756-kube-api-access-xr2dt\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.988566 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-scripts\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:02 crc kubenswrapper[4907]: I1009 19:48:02.991739 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.003524 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-config-data\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.005480 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-scripts\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.009166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.014346 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2dt\" (UniqueName: \"kubernetes.io/projected/4360d8b6-f761-4e20-acd0-3cb6580dd756-kube-api-access-xr2dt\") pod \"nova-cell1-conductor-db-sync-6v72h\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:03 crc kubenswrapper[4907]: W1009 19:48:03.126654 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5325af37_9aae_4a1d_bfc1_b07ccca70588.slice/crio-7cf038b6d89d437e33d1defeace4e0e6c99f45bfe3c2317f35da26ae07c774d0 WatchSource:0}: Error finding container 
7cf038b6d89d437e33d1defeace4e0e6c99f45bfe3c2317f35da26ae07c774d0: Status 404 returned error can't find the container with id 7cf038b6d89d437e33d1defeace4e0e6c99f45bfe3c2317f35da26ae07c774d0 Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.135098 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.165692 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.174697 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.243994 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.351684 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-tkk4m"] Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.710997 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a36484d9-eee6-44ce-9c42-3591dd75d1fa","Type":"ContainerStarted","Data":"50a8a21d9b027e95184919ccc7a4149ecef76179d83a9d00172d48292d4104fa"} Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.712226 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7705908-7b6b-4a07-91bd-1d2e03bd8e22","Type":"ContainerStarted","Data":"4f51253f10019ca3fca92d95fbd69047ab78a84247e82b8c693c70abf5e5ccbf"} Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.714162 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-psdhs" event={"ID":"3525ecd6-fd9f-47bd-b83d-7eb303d3032c","Type":"ContainerStarted","Data":"bb9fbdd7e0b4569504bebb45b35f4e77651f747819e5c6d15b631d8f4b689e15"} Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.715689 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54","Type":"ContainerStarted","Data":"3813c3e90618c6399f882a461fc1435e959cecfad3ac1b1c226b16e846b27ee7"} Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.717266 4907 generic.go:334] "Generic (PLEG): container finished" podID="ea4853ad-8428-43e1-839b-788a9e672eec" containerID="4a6e1fdaaedf653a99498956c11e865d2f8110f4af0a352c36bbdb16cbd750e4" exitCode=0 Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.717328 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" event={"ID":"ea4853ad-8428-43e1-839b-788a9e672eec","Type":"ContainerDied","Data":"4a6e1fdaaedf653a99498956c11e865d2f8110f4af0a352c36bbdb16cbd750e4"} Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.717345 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" event={"ID":"ea4853ad-8428-43e1-839b-788a9e672eec","Type":"ContainerStarted","Data":"d48631948bc0c7a8f3e99fbd80707ecc578095cc366aba5b854ed7d02393b922"} Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.718965 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325af37-9aae-4a1d-bfc1-b07ccca70588","Type":"ContainerStarted","Data":"7cf038b6d89d437e33d1defeace4e0e6c99f45bfe3c2317f35da26ae07c774d0"} Oct 09 19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.737384 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-psdhs" podStartSLOduration=2.737363608 podStartE2EDuration="2.737363608s" podCreationTimestamp="2025-10-09 19:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:03.731079986 +0000 UTC m=+1169.263047475" watchObservedRunningTime="2025-10-09 19:48:03.737363608 +0000 UTC m=+1169.269331097" Oct 09 
19:48:03 crc kubenswrapper[4907]: I1009 19:48:03.812024 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6v72h"] Oct 09 19:48:04 crc kubenswrapper[4907]: I1009 19:48:04.775202 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" event={"ID":"ea4853ad-8428-43e1-839b-788a9e672eec","Type":"ContainerStarted","Data":"cab012b751061895123e234f0961d089c5b413dab465ae86c1432711255f56ee"} Oct 09 19:48:04 crc kubenswrapper[4907]: I1009 19:48:04.776570 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" Oct 09 19:48:04 crc kubenswrapper[4907]: I1009 19:48:04.777413 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6v72h" event={"ID":"4360d8b6-f761-4e20-acd0-3cb6580dd756","Type":"ContainerStarted","Data":"f847f77253380e5ef8d7eecf30888d4d43e1510e07098e469eebea69a9aa68ec"} Oct 09 19:48:04 crc kubenswrapper[4907]: I1009 19:48:04.777455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6v72h" event={"ID":"4360d8b6-f761-4e20-acd0-3cb6580dd756","Type":"ContainerStarted","Data":"6d8a6e6d6ef93f5079815d9f9bf22b4c93a7de0fcc4f66ad7374c8ccf71f19bd"} Oct 09 19:48:04 crc kubenswrapper[4907]: I1009 19:48:04.813722 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" podStartSLOduration=3.813703993 podStartE2EDuration="3.813703993s" podCreationTimestamp="2025-10-09 19:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:04.799239032 +0000 UTC m=+1170.331206541" watchObservedRunningTime="2025-10-09 19:48:04.813703993 +0000 UTC m=+1170.345671482" Oct 09 19:48:04 crc kubenswrapper[4907]: I1009 19:48:04.819454 4907 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-conductor-db-sync-6v72h" podStartSLOduration=2.819434242 podStartE2EDuration="2.819434242s" podCreationTimestamp="2025-10-09 19:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:04.817015873 +0000 UTC m=+1170.348983372" watchObservedRunningTime="2025-10-09 19:48:04.819434242 +0000 UTC m=+1170.351401721" Oct 09 19:48:05 crc kubenswrapper[4907]: I1009 19:48:05.479307 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:05 crc kubenswrapper[4907]: I1009 19:48:05.488443 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.841314 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325af37-9aae-4a1d-bfc1-b07ccca70588","Type":"ContainerStarted","Data":"1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34"} Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.842006 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325af37-9aae-4a1d-bfc1-b07ccca70588","Type":"ContainerStarted","Data":"c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd"} Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.846611 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a36484d9-eee6-44ce-9c42-3591dd75d1fa","Type":"ContainerStarted","Data":"89aa8d8d8425e8422b1529293400341f0150518dbeb7b37f1c94bb23921f39e2"} Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.846688 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a36484d9-eee6-44ce-9c42-3591dd75d1fa","Type":"ContainerStarted","Data":"8bb7798b7f48d2c170ea28b27869ef62bf71fff19424cdd8ab19315513ee6ba8"} Oct 09 19:48:07 crc kubenswrapper[4907]: 
I1009 19:48:07.846637 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-log" containerID="cri-o://8bb7798b7f48d2c170ea28b27869ef62bf71fff19424cdd8ab19315513ee6ba8" gracePeriod=30 Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.846687 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-metadata" containerID="cri-o://89aa8d8d8425e8422b1529293400341f0150518dbeb7b37f1c94bb23921f39e2" gracePeriod=30 Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.855872 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7705908-7b6b-4a07-91bd-1d2e03bd8e22","Type":"ContainerStarted","Data":"53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc"} Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.856253 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b7705908-7b6b-4a07-91bd-1d2e03bd8e22" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc" gracePeriod=30 Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.865959 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.438962845 podStartE2EDuration="6.865936604s" podCreationTimestamp="2025-10-09 19:48:01 +0000 UTC" firstStartedPulling="2025-10-09 19:48:03.133691124 +0000 UTC m=+1168.665658603" lastFinishedPulling="2025-10-09 19:48:06.560664873 +0000 UTC m=+1172.092632362" observedRunningTime="2025-10-09 19:48:07.859946819 +0000 UTC m=+1173.391914328" watchObservedRunningTime="2025-10-09 19:48:07.865936604 +0000 UTC m=+1173.397904093" Oct 09 19:48:07 crc 
kubenswrapper[4907]: I1009 19:48:07.866736 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54","Type":"ContainerStarted","Data":"275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750"} Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.885481 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.477679292 podStartE2EDuration="6.885430026s" podCreationTimestamp="2025-10-09 19:48:01 +0000 UTC" firstStartedPulling="2025-10-09 19:48:03.147010846 +0000 UTC m=+1168.678978335" lastFinishedPulling="2025-10-09 19:48:06.55476158 +0000 UTC m=+1172.086729069" observedRunningTime="2025-10-09 19:48:07.884822791 +0000 UTC m=+1173.416790310" watchObservedRunningTime="2025-10-09 19:48:07.885430026 +0000 UTC m=+1173.417397525" Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.905974 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.352441569 podStartE2EDuration="6.905957103s" podCreationTimestamp="2025-10-09 19:48:01 +0000 UTC" firstStartedPulling="2025-10-09 19:48:03.001597454 +0000 UTC m=+1168.533564943" lastFinishedPulling="2025-10-09 19:48:06.555112978 +0000 UTC m=+1172.087080477" observedRunningTime="2025-10-09 19:48:07.905591424 +0000 UTC m=+1173.437558933" watchObservedRunningTime="2025-10-09 19:48:07.905957103 +0000 UTC m=+1173.437924592" Oct 09 19:48:07 crc kubenswrapper[4907]: I1009 19:48:07.927268 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.53329349 podStartE2EDuration="6.927250139s" podCreationTimestamp="2025-10-09 19:48:01 +0000 UTC" firstStartedPulling="2025-10-09 19:48:03.16077888 +0000 UTC m=+1168.692746369" lastFinishedPulling="2025-10-09 19:48:06.554735519 +0000 UTC m=+1172.086703018" observedRunningTime="2025-10-09 
19:48:07.922641897 +0000 UTC m=+1173.454609416" watchObservedRunningTime="2025-10-09 19:48:07.927250139 +0000 UTC m=+1173.459217628" Oct 09 19:48:08 crc kubenswrapper[4907]: I1009 19:48:08.941683 4907 generic.go:334] "Generic (PLEG): container finished" podID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerID="89aa8d8d8425e8422b1529293400341f0150518dbeb7b37f1c94bb23921f39e2" exitCode=0 Oct 09 19:48:08 crc kubenswrapper[4907]: I1009 19:48:08.941985 4907 generic.go:334] "Generic (PLEG): container finished" podID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerID="8bb7798b7f48d2c170ea28b27869ef62bf71fff19424cdd8ab19315513ee6ba8" exitCode=143 Oct 09 19:48:08 crc kubenswrapper[4907]: I1009 19:48:08.943285 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a36484d9-eee6-44ce-9c42-3591dd75d1fa","Type":"ContainerDied","Data":"89aa8d8d8425e8422b1529293400341f0150518dbeb7b37f1c94bb23921f39e2"} Oct 09 19:48:08 crc kubenswrapper[4907]: I1009 19:48:08.943322 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a36484d9-eee6-44ce-9c42-3591dd75d1fa","Type":"ContainerDied","Data":"8bb7798b7f48d2c170ea28b27869ef62bf71fff19424cdd8ab19315513ee6ba8"} Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.171189 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.231304 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-config-data\") pod \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.231499 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-combined-ca-bundle\") pod \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.231533 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36484d9-eee6-44ce-9c42-3591dd75d1fa-logs\") pod \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.231554 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp6sw\" (UniqueName: \"kubernetes.io/projected/a36484d9-eee6-44ce-9c42-3591dd75d1fa-kube-api-access-bp6sw\") pod \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\" (UID: \"a36484d9-eee6-44ce-9c42-3591dd75d1fa\") " Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.231873 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36484d9-eee6-44ce-9c42-3591dd75d1fa-logs" (OuterVolumeSpecName: "logs") pod "a36484d9-eee6-44ce-9c42-3591dd75d1fa" (UID: "a36484d9-eee6-44ce-9c42-3591dd75d1fa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.232597 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a36484d9-eee6-44ce-9c42-3591dd75d1fa-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.259866 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36484d9-eee6-44ce-9c42-3591dd75d1fa-kube-api-access-bp6sw" (OuterVolumeSpecName: "kube-api-access-bp6sw") pod "a36484d9-eee6-44ce-9c42-3591dd75d1fa" (UID: "a36484d9-eee6-44ce-9c42-3591dd75d1fa"). InnerVolumeSpecName "kube-api-access-bp6sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.268561 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-config-data" (OuterVolumeSpecName: "config-data") pod "a36484d9-eee6-44ce-9c42-3591dd75d1fa" (UID: "a36484d9-eee6-44ce-9c42-3591dd75d1fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.271998 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a36484d9-eee6-44ce-9c42-3591dd75d1fa" (UID: "a36484d9-eee6-44ce-9c42-3591dd75d1fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.333812 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.333848 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp6sw\" (UniqueName: \"kubernetes.io/projected/a36484d9-eee6-44ce-9c42-3591dd75d1fa-kube-api-access-bp6sw\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.333858 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36484d9-eee6-44ce-9c42-3591dd75d1fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.956425 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a36484d9-eee6-44ce-9c42-3591dd75d1fa","Type":"ContainerDied","Data":"50a8a21d9b027e95184919ccc7a4149ecef76179d83a9d00172d48292d4104fa"} Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.956491 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.957209 4907 scope.go:117] "RemoveContainer" containerID="89aa8d8d8425e8422b1529293400341f0150518dbeb7b37f1c94bb23921f39e2" Oct 09 19:48:09 crc kubenswrapper[4907]: I1009 19:48:09.997104 4907 scope.go:117] "RemoveContainer" containerID="8bb7798b7f48d2c170ea28b27869ef62bf71fff19424cdd8ab19315513ee6ba8" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.002530 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.015559 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.037342 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:10 crc kubenswrapper[4907]: E1009 19:48:10.038061 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-log" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.038146 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-log" Oct 09 19:48:10 crc kubenswrapper[4907]: E1009 19:48:10.038245 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-metadata" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.038345 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-metadata" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.038615 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-metadata" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.038711 4907 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" containerName="nova-metadata-log" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.041533 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.047015 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.047224 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.059925 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.150569 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.150825 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.151086 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdcf38c0-3a92-4c35-85fb-8cd760308083-logs\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.151148 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-config-data\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.151218 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnc6z\" (UniqueName: \"kubernetes.io/projected/fdcf38c0-3a92-4c35-85fb-8cd760308083-kube-api-access-dnc6z\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.254002 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.254194 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdcf38c0-3a92-4c35-85fb-8cd760308083-logs\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.254254 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-config-data\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.254362 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc6z\" (UniqueName: 
\"kubernetes.io/projected/fdcf38c0-3a92-4c35-85fb-8cd760308083-kube-api-access-dnc6z\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.254447 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.255963 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdcf38c0-3a92-4c35-85fb-8cd760308083-logs\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.258417 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.259329 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.270845 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-config-data\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0" Oct 09 19:48:10 crc 
kubenswrapper[4907]: I1009 19:48:10.273985 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnc6z\" (UniqueName: \"kubernetes.io/projected/fdcf38c0-3a92-4c35-85fb-8cd760308083-kube-api-access-dnc6z\") pod \"nova-metadata-0\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") " pod="openstack/nova-metadata-0"
Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.375134 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.893458 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 19:48:10 crc kubenswrapper[4907]: I1009 19:48:10.965186 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdcf38c0-3a92-4c35-85fb-8cd760308083","Type":"ContainerStarted","Data":"cd7e9699e976ce35c4d2164ab36d6bba06e0b999766a9743236f3f07c3a4e4bc"}
Oct 09 19:48:11 crc kubenswrapper[4907]: I1009 19:48:11.161842 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36484d9-eee6-44ce-9c42-3591dd75d1fa" path="/var/lib/kubelet/pods/a36484d9-eee6-44ce-9c42-3591dd75d1fa/volumes"
Oct 09 19:48:11 crc kubenswrapper[4907]: I1009 19:48:11.988090 4907 generic.go:334] "Generic (PLEG): container finished" podID="3525ecd6-fd9f-47bd-b83d-7eb303d3032c" containerID="bb9fbdd7e0b4569504bebb45b35f4e77651f747819e5c6d15b631d8f4b689e15" exitCode=0
Oct 09 19:48:11 crc kubenswrapper[4907]: I1009 19:48:11.988154 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-psdhs" event={"ID":"3525ecd6-fd9f-47bd-b83d-7eb303d3032c","Type":"ContainerDied","Data":"bb9fbdd7e0b4569504bebb45b35f4e77651f747819e5c6d15b631d8f4b689e15"}
Oct 09 19:48:11 crc kubenswrapper[4907]: I1009 19:48:11.992827 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdcf38c0-3a92-4c35-85fb-8cd760308083","Type":"ContainerStarted","Data":"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"}
Oct 09 19:48:11 crc kubenswrapper[4907]: I1009 19:48:11.992872 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdcf38c0-3a92-4c35-85fb-8cd760308083","Type":"ContainerStarted","Data":"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"}
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.335063 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.348503 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.348596 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.386403 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.386690 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.413519 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.428645 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.437074 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.43704709 podStartE2EDuration="3.43704709s" podCreationTimestamp="2025-10-09 19:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:12.046477719 +0000 UTC m=+1177.578445218" watchObservedRunningTime="2025-10-09 19:48:12.43704709 +0000 UTC m=+1177.969014609"
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.516397 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78t58"]
Oct 09 19:48:12 crc kubenswrapper[4907]: I1009 19:48:12.516684 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-78t58" podUID="a84c9813-0bcd-4c28-aa91-219dec410336" containerName="dnsmasq-dns" containerID="cri-o://fddbf440ebdeefbe0a53dcebd417d27aeb9d01507c8789be83fbde014e053576" gracePeriod=10
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.006136 4907 generic.go:334] "Generic (PLEG): container finished" podID="a84c9813-0bcd-4c28-aa91-219dec410336" containerID="fddbf440ebdeefbe0a53dcebd417d27aeb9d01507c8789be83fbde014e053576" exitCode=0
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.006314 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78t58" event={"ID":"a84c9813-0bcd-4c28-aa91-219dec410336","Type":"ContainerDied","Data":"fddbf440ebdeefbe0a53dcebd417d27aeb9d01507c8789be83fbde014e053576"}
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.041739 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.435594 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.436196 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.656865 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-psdhs"
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.668233 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78t58"
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842141 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config\") pod \"a84c9813-0bcd-4c28-aa91-219dec410336\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842197 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-svc\") pod \"a84c9813-0bcd-4c28-aa91-219dec410336\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842213 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-sb\") pod \"a84c9813-0bcd-4c28-aa91-219dec410336\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-swift-storage-0\") pod \"a84c9813-0bcd-4c28-aa91-219dec410336\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842262 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-config-data\") pod \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842298 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnnkz\" (UniqueName: \"kubernetes.io/projected/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-kube-api-access-nnnkz\") pod \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842323 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-scripts\") pod \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842348 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-combined-ca-bundle\") pod \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\" (UID: \"3525ecd6-fd9f-47bd-b83d-7eb303d3032c\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842372 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-nb\") pod \"a84c9813-0bcd-4c28-aa91-219dec410336\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.842404 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25csk\" (UniqueName: \"kubernetes.io/projected/a84c9813-0bcd-4c28-aa91-219dec410336-kube-api-access-25csk\") pod \"a84c9813-0bcd-4c28-aa91-219dec410336\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.861215 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-kube-api-access-nnnkz" (OuterVolumeSpecName: "kube-api-access-nnnkz") pod "3525ecd6-fd9f-47bd-b83d-7eb303d3032c" (UID: "3525ecd6-fd9f-47bd-b83d-7eb303d3032c"). InnerVolumeSpecName "kube-api-access-nnnkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.865786 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84c9813-0bcd-4c28-aa91-219dec410336-kube-api-access-25csk" (OuterVolumeSpecName: "kube-api-access-25csk") pod "a84c9813-0bcd-4c28-aa91-219dec410336" (UID: "a84c9813-0bcd-4c28-aa91-219dec410336"). InnerVolumeSpecName "kube-api-access-25csk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.890063 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-scripts" (OuterVolumeSpecName: "scripts") pod "3525ecd6-fd9f-47bd-b83d-7eb303d3032c" (UID: "3525ecd6-fd9f-47bd-b83d-7eb303d3032c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.910226 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a84c9813-0bcd-4c28-aa91-219dec410336" (UID: "a84c9813-0bcd-4c28-aa91-219dec410336"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.924947 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3525ecd6-fd9f-47bd-b83d-7eb303d3032c" (UID: "3525ecd6-fd9f-47bd-b83d-7eb303d3032c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.932576 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-config-data" (OuterVolumeSpecName: "config-data") pod "3525ecd6-fd9f-47bd-b83d-7eb303d3032c" (UID: "3525ecd6-fd9f-47bd-b83d-7eb303d3032c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.944865 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config" (OuterVolumeSpecName: "config") pod "a84c9813-0bcd-4c28-aa91-219dec410336" (UID: "a84c9813-0bcd-4c28-aa91-219dec410336"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945098 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config\") pod \"a84c9813-0bcd-4c28-aa91-219dec410336\" (UID: \"a84c9813-0bcd-4c28-aa91-219dec410336\") "
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945363 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a84c9813-0bcd-4c28-aa91-219dec410336" (UID: "a84c9813-0bcd-4c28-aa91-219dec410336"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: W1009 19:48:13.945546 4907 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a84c9813-0bcd-4c28-aa91-219dec410336/volumes/kubernetes.io~configmap/config
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945559 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config" (OuterVolumeSpecName: "config") pod "a84c9813-0bcd-4c28-aa91-219dec410336" (UID: "a84c9813-0bcd-4c28-aa91-219dec410336"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945812 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-config\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945831 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945848 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945859 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnnkz\" (UniqueName: \"kubernetes.io/projected/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-kube-api-access-nnnkz\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945870 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945884 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3525ecd6-fd9f-47bd-b83d-7eb303d3032c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945894 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.945906 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25csk\" (UniqueName: \"kubernetes.io/projected/a84c9813-0bcd-4c28-aa91-219dec410336-kube-api-access-25csk\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.954261 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a84c9813-0bcd-4c28-aa91-219dec410336" (UID: "a84c9813-0bcd-4c28-aa91-219dec410336"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:13 crc kubenswrapper[4907]: I1009 19:48:13.988873 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a84c9813-0bcd-4c28-aa91-219dec410336" (UID: "a84c9813-0bcd-4c28-aa91-219dec410336"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.017963 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-psdhs" event={"ID":"3525ecd6-fd9f-47bd-b83d-7eb303d3032c","Type":"ContainerDied","Data":"9972355f650205d9c0b87dcc1e0f52effbebb6cbe730aa0b8a5df247b6e9525c"}
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.018079 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9972355f650205d9c0b87dcc1e0f52effbebb6cbe730aa0b8a5df247b6e9525c"
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.017977 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-psdhs"
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.020127 4907 generic.go:334] "Generic (PLEG): container finished" podID="4360d8b6-f761-4e20-acd0-3cb6580dd756" containerID="f847f77253380e5ef8d7eecf30888d4d43e1510e07098e469eebea69a9aa68ec" exitCode=0
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.020187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6v72h" event={"ID":"4360d8b6-f761-4e20-acd0-3cb6580dd756","Type":"ContainerDied","Data":"f847f77253380e5ef8d7eecf30888d4d43e1510e07098e469eebea69a9aa68ec"}
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.026118 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-78t58"
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.030023 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-78t58" event={"ID":"a84c9813-0bcd-4c28-aa91-219dec410336","Type":"ContainerDied","Data":"f80f3b94efe5c17942c7c0332bc9a04b89df9172486489b5479a2b729e04240e"}
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.030085 4907 scope.go:117] "RemoveContainer" containerID="fddbf440ebdeefbe0a53dcebd417d27aeb9d01507c8789be83fbde014e053576"
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.049569 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.049597 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a84c9813-0bcd-4c28-aa91-219dec410336-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.111650 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78t58"]
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.117459 4907 scope.go:117] "RemoveContainer" containerID="bf18b249579a836b1b5c65c2dd0ced4c4d667f0d408d962d9de8b0c46771bd08"
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.119454 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-78t58"]
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.206346 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.216885 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.217234 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-log" containerID="cri-o://c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd" gracePeriod=30
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.217336 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-api" containerID="cri-o://1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34" gracePeriod=30
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.230086 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.230329 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-log" containerID="cri-o://c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139" gracePeriod=30
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.230510 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-metadata" containerID="cri-o://65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e" gracePeriod=30
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.691690 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.864603 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-nova-metadata-tls-certs\") pod \"fdcf38c0-3a92-4c35-85fb-8cd760308083\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") "
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.864974 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-combined-ca-bundle\") pod \"fdcf38c0-3a92-4c35-85fb-8cd760308083\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") "
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.865457 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdcf38c0-3a92-4c35-85fb-8cd760308083-logs\") pod \"fdcf38c0-3a92-4c35-85fb-8cd760308083\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") "
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.865704 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-config-data\") pod \"fdcf38c0-3a92-4c35-85fb-8cd760308083\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") "
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.865977 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnc6z\" (UniqueName: \"kubernetes.io/projected/fdcf38c0-3a92-4c35-85fb-8cd760308083-kube-api-access-dnc6z\") pod \"fdcf38c0-3a92-4c35-85fb-8cd760308083\" (UID: \"fdcf38c0-3a92-4c35-85fb-8cd760308083\") "
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.866077 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdcf38c0-3a92-4c35-85fb-8cd760308083-logs" (OuterVolumeSpecName: "logs") pod "fdcf38c0-3a92-4c35-85fb-8cd760308083" (UID: "fdcf38c0-3a92-4c35-85fb-8cd760308083"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.867245 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdcf38c0-3a92-4c35-85fb-8cd760308083-logs\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.870716 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdcf38c0-3a92-4c35-85fb-8cd760308083-kube-api-access-dnc6z" (OuterVolumeSpecName: "kube-api-access-dnc6z") pod "fdcf38c0-3a92-4c35-85fb-8cd760308083" (UID: "fdcf38c0-3a92-4c35-85fb-8cd760308083"). InnerVolumeSpecName "kube-api-access-dnc6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.904583 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdcf38c0-3a92-4c35-85fb-8cd760308083" (UID: "fdcf38c0-3a92-4c35-85fb-8cd760308083"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.908067 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-config-data" (OuterVolumeSpecName: "config-data") pod "fdcf38c0-3a92-4c35-85fb-8cd760308083" (UID: "fdcf38c0-3a92-4c35-85fb-8cd760308083"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.937382 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fdcf38c0-3a92-4c35-85fb-8cd760308083" (UID: "fdcf38c0-3a92-4c35-85fb-8cd760308083"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.969329 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnc6z\" (UniqueName: \"kubernetes.io/projected/fdcf38c0-3a92-4c35-85fb-8cd760308083-kube-api-access-dnc6z\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.969362 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.969375 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:14 crc kubenswrapper[4907]: I1009 19:48:14.969386 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcf38c0-3a92-4c35-85fb-8cd760308083-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.043770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325af37-9aae-4a1d-bfc1-b07ccca70588","Type":"ContainerDied","Data":"c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd"}
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.047781 4907 generic.go:334] "Generic (PLEG): container finished" podID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerID="c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd" exitCode=143
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.064397 4907 generic.go:334] "Generic (PLEG): container finished" podID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerID="65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e" exitCode=0
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.064431 4907 generic.go:334] "Generic (PLEG): container finished" podID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerID="c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139" exitCode=143
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.064677 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.066809 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdcf38c0-3a92-4c35-85fb-8cd760308083","Type":"ContainerDied","Data":"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"}
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.066847 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdcf38c0-3a92-4c35-85fb-8cd760308083","Type":"ContainerDied","Data":"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"}
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.066861 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fdcf38c0-3a92-4c35-85fb-8cd760308083","Type":"ContainerDied","Data":"cd7e9699e976ce35c4d2164ab36d6bba06e0b999766a9743236f3f07c3a4e4bc"}
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.066877 4907 scope.go:117] "RemoveContainer" containerID="65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.155646 4907 scope.go:117] "RemoveContainer" containerID="c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.179963 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84c9813-0bcd-4c28-aa91-219dec410336" path="/var/lib/kubelet/pods/a84c9813-0bcd-4c28-aa91-219dec410336/volumes"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.180587 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.180608 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.273663 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 19:48:15 crc kubenswrapper[4907]: E1009 19:48:15.274896 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-log"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.274931 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-log"
Oct 09 19:48:15 crc kubenswrapper[4907]: E1009 19:48:15.274964 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-metadata"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.274970 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-metadata"
Oct 09 19:48:15 crc kubenswrapper[4907]: E1009 19:48:15.274994 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3525ecd6-fd9f-47bd-b83d-7eb303d3032c" containerName="nova-manage"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.275003 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3525ecd6-fd9f-47bd-b83d-7eb303d3032c" containerName="nova-manage"
Oct 09 19:48:15 crc kubenswrapper[4907]: E1009 19:48:15.275024 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84c9813-0bcd-4c28-aa91-219dec410336" containerName="init"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.275030 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84c9813-0bcd-4c28-aa91-219dec410336" containerName="init"
Oct 09 19:48:15 crc kubenswrapper[4907]: E1009 19:48:15.275048 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84c9813-0bcd-4c28-aa91-219dec410336" containerName="dnsmasq-dns"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.275055 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84c9813-0bcd-4c28-aa91-219dec410336" containerName="dnsmasq-dns"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.275404 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-log"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.275430 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" containerName="nova-metadata-metadata"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.275444 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3525ecd6-fd9f-47bd-b83d-7eb303d3032c" containerName="nova-manage"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.275483 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84c9813-0bcd-4c28-aa91-219dec410336" containerName="dnsmasq-dns"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.277076 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.285833 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.288527 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.303976 4907 scope.go:117] "RemoveContainer" containerID="65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"
Oct 09 19:48:15 crc kubenswrapper[4907]: E1009 19:48:15.304976 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e\": container with ID starting with 65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e not found: ID does not exist" containerID="65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.305054 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"} err="failed to get container status \"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e\": rpc error: code = NotFound desc = could not find container \"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e\": container with ID starting with 65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e not found: ID does not exist"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.305111 4907 scope.go:117] "RemoveContainer" containerID="c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"
Oct 09 19:48:15 crc kubenswrapper[4907]: E1009 19:48:15.305405 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139\": container with ID starting with c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139 not found: ID does not exist" containerID="c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.305568 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"} err="failed to get container status \"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139\": rpc error: code = NotFound desc = could not find container \"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139\": container with ID starting with c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139 not found: ID does not exist"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.305589 4907 scope.go:117] "RemoveContainer" containerID="65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.305809 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e"} err="failed to get container status \"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e\": rpc error: code = NotFound desc = could not find container \"65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e\": container with ID starting with 65944ad703a5b7bfa4b72829a96788d0e941ddf1a0220307a42c85274290ce8e not found: ID does not exist"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.305827 4907 scope.go:117] "RemoveContainer" containerID="c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.306045 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139"} err="failed to get container status \"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139\": rpc error: code = NotFound desc = could not find container \"c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139\": container with ID starting with c56df44a0e4ac3f02505e4a6f3dee2db7cbc2eab8a012424122fda06a2626139 not found: ID does not exist"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.317266 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.391398 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0"
Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.392055 4907 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cabc27-dfc9-4030-9508-cd366682d788-logs\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.392105 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx2t5\" (UniqueName: \"kubernetes.io/projected/45cabc27-dfc9-4030-9508-cd366682d788-kube-api-access-hx2t5\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.392161 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-config-data\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.392394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.494248 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.494326 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.494379 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cabc27-dfc9-4030-9508-cd366682d788-logs\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.494437 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2t5\" (UniqueName: \"kubernetes.io/projected/45cabc27-dfc9-4030-9508-cd366682d788-kube-api-access-hx2t5\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.494526 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-config-data\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.499933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cabc27-dfc9-4030-9508-cd366682d788-logs\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.502540 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-config-data\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 
19:48:15.513418 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.515428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.515679 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2t5\" (UniqueName: \"kubernetes.io/projected/45cabc27-dfc9-4030-9508-cd366682d788-kube-api-access-hx2t5\") pod \"nova-metadata-0\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.634243 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.646457 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.802739 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr2dt\" (UniqueName: \"kubernetes.io/projected/4360d8b6-f761-4e20-acd0-3cb6580dd756-kube-api-access-xr2dt\") pod \"4360d8b6-f761-4e20-acd0-3cb6580dd756\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.803233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-config-data\") pod \"4360d8b6-f761-4e20-acd0-3cb6580dd756\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.803302 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-combined-ca-bundle\") pod \"4360d8b6-f761-4e20-acd0-3cb6580dd756\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.803429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-scripts\") pod \"4360d8b6-f761-4e20-acd0-3cb6580dd756\" (UID: \"4360d8b6-f761-4e20-acd0-3cb6580dd756\") " Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.808613 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4360d8b6-f761-4e20-acd0-3cb6580dd756-kube-api-access-xr2dt" (OuterVolumeSpecName: "kube-api-access-xr2dt") pod "4360d8b6-f761-4e20-acd0-3cb6580dd756" (UID: "4360d8b6-f761-4e20-acd0-3cb6580dd756"). InnerVolumeSpecName "kube-api-access-xr2dt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.813753 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-scripts" (OuterVolumeSpecName: "scripts") pod "4360d8b6-f761-4e20-acd0-3cb6580dd756" (UID: "4360d8b6-f761-4e20-acd0-3cb6580dd756"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.834303 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4360d8b6-f761-4e20-acd0-3cb6580dd756" (UID: "4360d8b6-f761-4e20-acd0-3cb6580dd756"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.847291 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-config-data" (OuterVolumeSpecName: "config-data") pod "4360d8b6-f761-4e20-acd0-3cb6580dd756" (UID: "4360d8b6-f761-4e20-acd0-3cb6580dd756"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.906336 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr2dt\" (UniqueName: \"kubernetes.io/projected/4360d8b6-f761-4e20-acd0-3cb6580dd756-kube-api-access-xr2dt\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.906372 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.906382 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:15 crc kubenswrapper[4907]: I1009 19:48:15.906392 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4360d8b6-f761-4e20-acd0-3cb6580dd756-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.075442 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6v72h" event={"ID":"4360d8b6-f761-4e20-acd0-3cb6580dd756","Type":"ContainerDied","Data":"6d8a6e6d6ef93f5079815d9f9bf22b4c93a7de0fcc4f66ad7374c8ccf71f19bd"} Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.075502 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d8a6e6d6ef93f5079815d9f9bf22b4c93a7de0fcc4f66ad7374c8ccf71f19bd" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.075574 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6v72h" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.100818 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.101372 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" containerName="nova-scheduler-scheduler" containerID="cri-o://275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" gracePeriod=30 Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.139613 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 19:48:16 crc kubenswrapper[4907]: E1009 19:48:16.140019 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4360d8b6-f761-4e20-acd0-3cb6580dd756" containerName="nova-cell1-conductor-db-sync" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.140034 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4360d8b6-f761-4e20-acd0-3cb6580dd756" containerName="nova-cell1-conductor-db-sync" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.140209 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4360d8b6-f761-4e20-acd0-3cb6580dd756" containerName="nova-cell1-conductor-db-sync" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.140908 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.144932 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.155130 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.312961 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9tww\" (UniqueName: \"kubernetes.io/projected/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-kube-api-access-j9tww\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.313226 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.313271 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.414645 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc 
kubenswrapper[4907]: I1009 19:48:16.414694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.414764 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9tww\" (UniqueName: \"kubernetes.io/projected/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-kube-api-access-j9tww\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.419078 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.422683 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.434291 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9tww\" (UniqueName: \"kubernetes.io/projected/3dbc636b-ea8b-4e61-bce8-2d6aaae5d855-kube-api-access-j9tww\") pod \"nova-cell1-conductor-0\" (UID: \"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855\") " pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.488323 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.902037 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 19:48:16 crc kubenswrapper[4907]: I1009 19:48:16.914537 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 19:48:16 crc kubenswrapper[4907]: W1009 19:48:16.917656 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dbc636b_ea8b_4e61_bce8_2d6aaae5d855.slice/crio-c20565280086ce05ce72de5c1a84e95508070524bfa55d73cde43cb0467a23a9 WatchSource:0}: Error finding container c20565280086ce05ce72de5c1a84e95508070524bfa55d73cde43cb0467a23a9: Status 404 returned error can't find the container with id c20565280086ce05ce72de5c1a84e95508070524bfa55d73cde43cb0467a23a9 Oct 09 19:48:17 crc kubenswrapper[4907]: I1009 19:48:17.117376 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45cabc27-dfc9-4030-9508-cd366682d788","Type":"ContainerStarted","Data":"e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46"} Oct 09 19:48:17 crc kubenswrapper[4907]: I1009 19:48:17.117705 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45cabc27-dfc9-4030-9508-cd366682d788","Type":"ContainerStarted","Data":"2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0"} Oct 09 19:48:17 crc kubenswrapper[4907]: I1009 19:48:17.117716 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45cabc27-dfc9-4030-9508-cd366682d788","Type":"ContainerStarted","Data":"1f0c81a464e1812414290a6378964e92663681bbffaa7ef7db706aa75fe8678c"} Oct 09 19:48:17 crc kubenswrapper[4907]: I1009 19:48:17.120860 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855","Type":"ContainerStarted","Data":"c20565280086ce05ce72de5c1a84e95508070524bfa55d73cde43cb0467a23a9"} Oct 09 19:48:17 crc kubenswrapper[4907]: I1009 19:48:17.138409 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.138389621 podStartE2EDuration="2.138389621s" podCreationTimestamp="2025-10-09 19:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:17.131990326 +0000 UTC m=+1182.663957825" watchObservedRunningTime="2025-10-09 19:48:17.138389621 +0000 UTC m=+1182.670357110" Oct 09 19:48:17 crc kubenswrapper[4907]: I1009 19:48:17.170613 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdcf38c0-3a92-4c35-85fb-8cd760308083" path="/var/lib/kubelet/pods/fdcf38c0-3a92-4c35-85fb-8cd760308083/volumes" Oct 09 19:48:17 crc kubenswrapper[4907]: E1009 19:48:17.389484 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 19:48:17 crc kubenswrapper[4907]: E1009 19:48:17.391519 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 19:48:17 crc kubenswrapper[4907]: E1009 19:48:17.393277 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 19:48:17 crc kubenswrapper[4907]: E1009 19:48:17.393331 4907 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" containerName="nova-scheduler-scheduler" Oct 09 19:48:18 crc kubenswrapper[4907]: I1009 19:48:18.146440 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3dbc636b-ea8b-4e61-bce8-2d6aaae5d855","Type":"ContainerStarted","Data":"0332eee82823474a906e820e77d675e8945ff7f0a5f3a1a9f8aed708016bdfb0"} Oct 09 19:48:18 crc kubenswrapper[4907]: I1009 19:48:18.163167 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.163152677 podStartE2EDuration="2.163152677s" podCreationTimestamp="2025-10-09 19:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:18.161850685 +0000 UTC m=+1183.693818174" watchObservedRunningTime="2025-10-09 19:48:18.163152677 +0000 UTC m=+1183.695120156" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.007175 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.155508 4907 generic.go:334] "Generic (PLEG): container finished" podID="a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" containerID="275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" exitCode=0 Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.155633 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.163119 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfrb5\" (UniqueName: \"kubernetes.io/projected/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-kube-api-access-wfrb5\") pod \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.163359 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-combined-ca-bundle\") pod \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.163398 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-config-data\") pod \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\" (UID: \"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54\") " Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.166916 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.166951 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54","Type":"ContainerDied","Data":"275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750"} Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.166976 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a1a97754-249a-4f50-ac7a-fd5dd2cb7b54","Type":"ContainerDied","Data":"3813c3e90618c6399f882a461fc1435e959cecfad3ac1b1c226b16e846b27ee7"} Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.166995 4907 scope.go:117] 
"RemoveContainer" containerID="275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.169110 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-kube-api-access-wfrb5" (OuterVolumeSpecName: "kube-api-access-wfrb5") pod "a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" (UID: "a1a97754-249a-4f50-ac7a-fd5dd2cb7b54"). InnerVolumeSpecName "kube-api-access-wfrb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.192040 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" (UID: "a1a97754-249a-4f50-ac7a-fd5dd2cb7b54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.198511 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-config-data" (OuterVolumeSpecName: "config-data") pod "a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" (UID: "a1a97754-249a-4f50-ac7a-fd5dd2cb7b54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.265660 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.265689 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.265698 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfrb5\" (UniqueName: \"kubernetes.io/projected/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54-kube-api-access-wfrb5\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.271106 4907 scope.go:117] "RemoveContainer" containerID="275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" Oct 09 19:48:19 crc kubenswrapper[4907]: E1009 19:48:19.271694 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750\": container with ID starting with 275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750 not found: ID does not exist" containerID="275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.271740 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750"} err="failed to get container status \"275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750\": rpc error: code = NotFound desc = could not find container \"275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750\": container with ID 
starting with 275db8764da64472b3e988acfea42153dd183f2e3ed01acdd28257171706c750 not found: ID does not exist" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.492811 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.505330 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.523859 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:19 crc kubenswrapper[4907]: E1009 19:48:19.524679 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" containerName="nova-scheduler-scheduler" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.524812 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" containerName="nova-scheduler-scheduler" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.525164 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" containerName="nova-scheduler-scheduler" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.526045 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.530935 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.537436 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.674223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvpz\" (UniqueName: \"kubernetes.io/projected/c7aa1be1-c204-41b1-8cf2-fc77138a5673-kube-api-access-7tvpz\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.674279 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-config-data\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.674320 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.775881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvpz\" (UniqueName: \"kubernetes.io/projected/c7aa1be1-c204-41b1-8cf2-fc77138a5673-kube-api-access-7tvpz\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.775940 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-config-data\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.775991 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.782216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.785007 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-config-data\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.799455 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvpz\" (UniqueName: \"kubernetes.io/projected/c7aa1be1-c204-41b1-8cf2-fc77138a5673-kube-api-access-7tvpz\") pod \"nova-scheduler-0\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " pod="openstack/nova-scheduler-0" Oct 09 19:48:19 crc kubenswrapper[4907]: I1009 19:48:19.849554 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.063803 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.168541 4907 generic.go:334] "Generic (PLEG): container finished" podID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerID="1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34" exitCode=0 Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.168611 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325af37-9aae-4a1d-bfc1-b07ccca70588","Type":"ContainerDied","Data":"1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34"} Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.168637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5325af37-9aae-4a1d-bfc1-b07ccca70588","Type":"ContainerDied","Data":"7cf038b6d89d437e33d1defeace4e0e6c99f45bfe3c2317f35da26ae07c774d0"} Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.168642 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.168655 4907 scope.go:117] "RemoveContainer" containerID="1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.183730 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325af37-9aae-4a1d-bfc1-b07ccca70588-logs\") pod \"5325af37-9aae-4a1d-bfc1-b07ccca70588\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.183861 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-config-data\") pod \"5325af37-9aae-4a1d-bfc1-b07ccca70588\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.184193 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5325af37-9aae-4a1d-bfc1-b07ccca70588-logs" (OuterVolumeSpecName: "logs") pod "5325af37-9aae-4a1d-bfc1-b07ccca70588" (UID: "5325af37-9aae-4a1d-bfc1-b07ccca70588"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.184687 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-combined-ca-bundle\") pod \"5325af37-9aae-4a1d-bfc1-b07ccca70588\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.185432 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78pv8\" (UniqueName: \"kubernetes.io/projected/5325af37-9aae-4a1d-bfc1-b07ccca70588-kube-api-access-78pv8\") pod \"5325af37-9aae-4a1d-bfc1-b07ccca70588\" (UID: \"5325af37-9aae-4a1d-bfc1-b07ccca70588\") " Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.186020 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5325af37-9aae-4a1d-bfc1-b07ccca70588-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.189427 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5325af37-9aae-4a1d-bfc1-b07ccca70588-kube-api-access-78pv8" (OuterVolumeSpecName: "kube-api-access-78pv8") pod "5325af37-9aae-4a1d-bfc1-b07ccca70588" (UID: "5325af37-9aae-4a1d-bfc1-b07ccca70588"). InnerVolumeSpecName "kube-api-access-78pv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.213358 4907 scope.go:117] "RemoveContainer" containerID="c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.229574 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-config-data" (OuterVolumeSpecName: "config-data") pod "5325af37-9aae-4a1d-bfc1-b07ccca70588" (UID: "5325af37-9aae-4a1d-bfc1-b07ccca70588"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.229610 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5325af37-9aae-4a1d-bfc1-b07ccca70588" (UID: "5325af37-9aae-4a1d-bfc1-b07ccca70588"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.232677 4907 scope.go:117] "RemoveContainer" containerID="1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34" Oct 09 19:48:20 crc kubenswrapper[4907]: E1009 19:48:20.233093 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34\": container with ID starting with 1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34 not found: ID does not exist" containerID="1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.233131 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34"} err="failed to get container status \"1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34\": rpc error: code = NotFound desc = could not find container \"1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34\": container with ID starting with 1e628c3f45a66d26774047aa8e6f214dcea7a1562ea1f39eeef5979cb43edb34 not found: ID does not exist" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.233156 4907 scope.go:117] "RemoveContainer" containerID="c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd" Oct 09 19:48:20 crc kubenswrapper[4907]: E1009 19:48:20.233403 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd\": container with ID starting with c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd not found: ID does not exist" containerID="c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.233423 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd"} err="failed to get container status \"c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd\": rpc error: code = NotFound desc = could not find container \"c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd\": container with ID starting with c46f3eeabd4f7f165cc8da0bfc1b5a0dbeac5072f037768df4debd83e19d67cd not found: ID does not exist" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.287598 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.287638 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5325af37-9aae-4a1d-bfc1-b07ccca70588-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.287654 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78pv8\" (UniqueName: \"kubernetes.io/projected/5325af37-9aae-4a1d-bfc1-b07ccca70588-kube-api-access-78pv8\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.369327 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:48:20 crc kubenswrapper[4907]: W1009 19:48:20.375086 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7aa1be1_c204_41b1_8cf2_fc77138a5673.slice/crio-6a12607c001f875429ebd2518d22a311d600ee2562d3e6e336403c97e7c3bb26 WatchSource:0}: Error finding container 6a12607c001f875429ebd2518d22a311d600ee2562d3e6e336403c97e7c3bb26: Status 404 returned error can't find the container with id 
6a12607c001f875429ebd2518d22a311d600ee2562d3e6e336403c97e7c3bb26 Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.489072 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.489591 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="553f4f9e-5f34-4731-898c-4f0cacf4b545" containerName="kube-state-metrics" containerID="cri-o://c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f" gracePeriod=30 Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.513898 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.523320 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.541219 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:20 crc kubenswrapper[4907]: E1009 19:48:20.541726 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-api" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.541751 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-api" Oct 09 19:48:20 crc kubenswrapper[4907]: E1009 19:48:20.541792 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-log" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.541801 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-log" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.542038 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-api" Oct 
09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.542070 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" containerName="nova-api-log" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.543299 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.545736 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.552811 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.635292 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.635348 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.693354 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.693398 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ecf8b6-cb47-4096-85e1-4286f45529db-logs\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.693442 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-config-data\") 
pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.693508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9w99\" (UniqueName: \"kubernetes.io/projected/c4ecf8b6-cb47-4096-85e1-4286f45529db-kube-api-access-c9w99\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.795661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9w99\" (UniqueName: \"kubernetes.io/projected/c4ecf8b6-cb47-4096-85e1-4286f45529db-kube-api-access-c9w99\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.795905 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.795936 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ecf8b6-cb47-4096-85e1-4286f45529db-logs\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.795989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-config-data\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.799042 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ecf8b6-cb47-4096-85e1-4286f45529db-logs\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.810321 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-config-data\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.824826 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9w99\" (UniqueName: \"kubernetes.io/projected/c4ecf8b6-cb47-4096-85e1-4286f45529db-kube-api-access-c9w99\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.826266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " pod="openstack/nova-api-0" Oct 09 19:48:20 crc kubenswrapper[4907]: I1009 19:48:20.907325 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.055990 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.168818 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5325af37-9aae-4a1d-bfc1-b07ccca70588" path="/var/lib/kubelet/pods/5325af37-9aae-4a1d-bfc1-b07ccca70588/volumes" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.170128 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a97754-249a-4f50-ac7a-fd5dd2cb7b54" path="/var/lib/kubelet/pods/a1a97754-249a-4f50-ac7a-fd5dd2cb7b54/volumes" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.181317 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7aa1be1-c204-41b1-8cf2-fc77138a5673","Type":"ContainerStarted","Data":"baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794"} Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.181356 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7aa1be1-c204-41b1-8cf2-fc77138a5673","Type":"ContainerStarted","Data":"6a12607c001f875429ebd2518d22a311d600ee2562d3e6e336403c97e7c3bb26"} Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.183792 4907 generic.go:334] "Generic (PLEG): container finished" podID="553f4f9e-5f34-4731-898c-4f0cacf4b545" containerID="c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f" exitCode=2 Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.183850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"553f4f9e-5f34-4731-898c-4f0cacf4b545","Type":"ContainerDied","Data":"c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f"} Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.183870 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"553f4f9e-5f34-4731-898c-4f0cacf4b545","Type":"ContainerDied","Data":"eabef965e504f2b7d6b8888ac724316d9fcede217a9dc364cfaad9ef5b4d3183"} Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.183886 4907 scope.go:117] "RemoveContainer" containerID="c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.183894 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.199630 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.199614126 podStartE2EDuration="2.199614126s" podCreationTimestamp="2025-10-09 19:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:21.191175151 +0000 UTC m=+1186.723142650" watchObservedRunningTime="2025-10-09 19:48:21.199614126 +0000 UTC m=+1186.731581615" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.202541 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpmtg\" (UniqueName: \"kubernetes.io/projected/553f4f9e-5f34-4731-898c-4f0cacf4b545-kube-api-access-kpmtg\") pod \"553f4f9e-5f34-4731-898c-4f0cacf4b545\" (UID: \"553f4f9e-5f34-4731-898c-4f0cacf4b545\") " Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.209721 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553f4f9e-5f34-4731-898c-4f0cacf4b545-kube-api-access-kpmtg" (OuterVolumeSpecName: "kube-api-access-kpmtg") pod "553f4f9e-5f34-4731-898c-4f0cacf4b545" (UID: "553f4f9e-5f34-4731-898c-4f0cacf4b545"). InnerVolumeSpecName "kube-api-access-kpmtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.218708 4907 scope.go:117] "RemoveContainer" containerID="c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f" Oct 09 19:48:21 crc kubenswrapper[4907]: E1009 19:48:21.219269 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f\": container with ID starting with c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f not found: ID does not exist" containerID="c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.219306 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f"} err="failed to get container status \"c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f\": rpc error: code = NotFound desc = could not find container \"c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f\": container with ID starting with c0109e0ad33edd7ea9a2ff8966ce10c2853b1b513364a59e0eb10f2c1e377a0f not found: ID does not exist" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.305840 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpmtg\" (UniqueName: \"kubernetes.io/projected/553f4f9e-5f34-4731-898c-4f0cacf4b545-kube-api-access-kpmtg\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:21 crc kubenswrapper[4907]: W1009 19:48:21.427860 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4ecf8b6_cb47_4096_85e1_4286f45529db.slice/crio-89857638abda854439a41ca595d71dfadf3440c8b5e87217a1241441fc6e7eb5 WatchSource:0}: Error finding container 89857638abda854439a41ca595d71dfadf3440c8b5e87217a1241441fc6e7eb5: Status 
404 returned error can't find the container with id 89857638abda854439a41ca595d71dfadf3440c8b5e87217a1241441fc6e7eb5 Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.434397 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.523940 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.537709 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.553197 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:48:21 crc kubenswrapper[4907]: E1009 19:48:21.553732 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553f4f9e-5f34-4731-898c-4f0cacf4b545" containerName="kube-state-metrics" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.553759 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="553f4f9e-5f34-4731-898c-4f0cacf4b545" containerName="kube-state-metrics" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.554004 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="553f4f9e-5f34-4731-898c-4f0cacf4b545" containerName="kube-state-metrics" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.554688 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.559531 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.559576 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.563152 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.712615 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.712767 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.712787 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmftr\" (UniqueName: \"kubernetes.io/projected/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-api-access-pmftr\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.712811 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.814804 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.815060 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.815088 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmftr\" (UniqueName: \"kubernetes.io/projected/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-api-access-pmftr\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.815125 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.823646 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.823818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.823931 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.839753 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmftr\" (UniqueName: \"kubernetes.io/projected/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-api-access-pmftr\") pod \"kube-state-metrics-0\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " pod="openstack/kube-state-metrics-0" Oct 09 19:48:21 crc kubenswrapper[4907]: I1009 19:48:21.871842 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.222053 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4ecf8b6-cb47-4096-85e1-4286f45529db","Type":"ContainerStarted","Data":"4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214"} Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.222107 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4ecf8b6-cb47-4096-85e1-4286f45529db","Type":"ContainerStarted","Data":"157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988"} Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.222123 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4ecf8b6-cb47-4096-85e1-4286f45529db","Type":"ContainerStarted","Data":"89857638abda854439a41ca595d71dfadf3440c8b5e87217a1241441fc6e7eb5"} Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.245514 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.245492783 podStartE2EDuration="2.245492783s" podCreationTimestamp="2025-10-09 19:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:22.245180455 +0000 UTC m=+1187.777147964" watchObservedRunningTime="2025-10-09 19:48:22.245492783 +0000 UTC m=+1187.777460272" Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.497941 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 19:48:22 crc kubenswrapper[4907]: W1009 19:48:22.511283 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7338c5b1_5214_40d6_a82c_f1f53697a06a.slice/crio-e8446bd6d5315098ed32e6e0dbe2ce2d9374ee4278b03558f46f39bfcdaf3e6a WatchSource:0}: Error 
finding container e8446bd6d5315098ed32e6e0dbe2ce2d9374ee4278b03558f46f39bfcdaf3e6a: Status 404 returned error can't find the container with id e8446bd6d5315098ed32e6e0dbe2ce2d9374ee4278b03558f46f39bfcdaf3e6a Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.637581 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.638413 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-central-agent" containerID="cri-o://564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa" gracePeriod=30 Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.638501 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="proxy-httpd" containerID="cri-o://eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075" gracePeriod=30 Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.638546 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="sg-core" containerID="cri-o://515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558" gracePeriod=30 Oct 09 19:48:22 crc kubenswrapper[4907]: I1009 19:48:22.638564 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-notification-agent" containerID="cri-o://2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27" gracePeriod=30 Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.176489 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553f4f9e-5f34-4731-898c-4f0cacf4b545" 
path="/var/lib/kubelet/pods/553f4f9e-5f34-4731-898c-4f0cacf4b545/volumes" Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.249196 4907 generic.go:334] "Generic (PLEG): container finished" podID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerID="eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075" exitCode=0 Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.249252 4907 generic.go:334] "Generic (PLEG): container finished" podID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerID="515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558" exitCode=2 Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.249267 4907 generic.go:334] "Generic (PLEG): container finished" podID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerID="564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa" exitCode=0 Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.249341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerDied","Data":"eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075"} Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.249375 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerDied","Data":"515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558"} Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.249391 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerDied","Data":"564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa"} Oct 09 19:48:23 crc kubenswrapper[4907]: I1009 19:48:23.252431 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"7338c5b1-5214-40d6-a82c-f1f53697a06a","Type":"ContainerStarted","Data":"e8446bd6d5315098ed32e6e0dbe2ce2d9374ee4278b03558f46f39bfcdaf3e6a"} Oct 09 19:48:24 crc kubenswrapper[4907]: I1009 19:48:24.271292 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7338c5b1-5214-40d6-a82c-f1f53697a06a","Type":"ContainerStarted","Data":"dee68b40af7e691a836abbf7a15b3773a698f5c1d80eda642569a7ba1377ce7b"} Oct 09 19:48:24 crc kubenswrapper[4907]: I1009 19:48:24.271650 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 19:48:24 crc kubenswrapper[4907]: I1009 19:48:24.289143 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.768921128 podStartE2EDuration="3.2891251s" podCreationTimestamp="2025-10-09 19:48:21 +0000 UTC" firstStartedPulling="2025-10-09 19:48:22.513387522 +0000 UTC m=+1188.045355011" lastFinishedPulling="2025-10-09 19:48:23.033591494 +0000 UTC m=+1188.565558983" observedRunningTime="2025-10-09 19:48:24.285481241 +0000 UTC m=+1189.817448740" watchObservedRunningTime="2025-10-09 19:48:24.2891251 +0000 UTC m=+1189.821092599" Oct 09 19:48:24 crc kubenswrapper[4907]: I1009 19:48:24.850189 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 19:48:25 crc kubenswrapper[4907]: I1009 19:48:25.635388 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 19:48:25 crc kubenswrapper[4907]: I1009 19:48:25.635843 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.155945 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.292333 4907 generic.go:334] "Generic (PLEG): container finished" podID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerID="2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27" exitCode=0 Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.292382 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerDied","Data":"2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27"} Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.292415 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a805e428-d0d8-423c-b0c7-f8d7bbd31408","Type":"ContainerDied","Data":"69bf4531f3d22e07ddf0786ca22a324f91e3838628c17a1b1a73644f178c4671"} Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.292438 4907 scope.go:117] "RemoveContainer" containerID="eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.292633 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.300284 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-scripts\") pod \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.300428 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-config-data\") pod \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.300488 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-sg-core-conf-yaml\") pod \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.300546 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fnsv\" (UniqueName: \"kubernetes.io/projected/a805e428-d0d8-423c-b0c7-f8d7bbd31408-kube-api-access-5fnsv\") pod \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.300599 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-run-httpd\") pod \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.300627 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-log-httpd\") pod \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.300764 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-combined-ca-bundle\") pod \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\" (UID: \"a805e428-d0d8-423c-b0c7-f8d7bbd31408\") " Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.302014 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a805e428-d0d8-423c-b0c7-f8d7bbd31408" (UID: "a805e428-d0d8-423c-b0c7-f8d7bbd31408"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.302246 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a805e428-d0d8-423c-b0c7-f8d7bbd31408" (UID: "a805e428-d0d8-423c-b0c7-f8d7bbd31408"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.310796 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-scripts" (OuterVolumeSpecName: "scripts") pod "a805e428-d0d8-423c-b0c7-f8d7bbd31408" (UID: "a805e428-d0d8-423c-b0c7-f8d7bbd31408"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.325308 4907 scope.go:117] "RemoveContainer" containerID="515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.325522 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a805e428-d0d8-423c-b0c7-f8d7bbd31408-kube-api-access-5fnsv" (OuterVolumeSpecName: "kube-api-access-5fnsv") pod "a805e428-d0d8-423c-b0c7-f8d7bbd31408" (UID: "a805e428-d0d8-423c-b0c7-f8d7bbd31408"). InnerVolumeSpecName "kube-api-access-5fnsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.331807 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a805e428-d0d8-423c-b0c7-f8d7bbd31408" (UID: "a805e428-d0d8-423c-b0c7-f8d7bbd31408"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.399959 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a805e428-d0d8-423c-b0c7-f8d7bbd31408" (UID: "a805e428-d0d8-423c-b0c7-f8d7bbd31408"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.403161 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.403196 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.403206 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.403216 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fnsv\" (UniqueName: \"kubernetes.io/projected/a805e428-d0d8-423c-b0c7-f8d7bbd31408-kube-api-access-5fnsv\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.403226 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.403234 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a805e428-d0d8-423c-b0c7-f8d7bbd31408-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.439826 4907 scope.go:117] "RemoveContainer" containerID="2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.442764 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-config-data" (OuterVolumeSpecName: "config-data") pod "a805e428-d0d8-423c-b0c7-f8d7bbd31408" (UID: "a805e428-d0d8-423c-b0c7-f8d7bbd31408"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.463114 4907 scope.go:117] "RemoveContainer" containerID="564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.490047 4907 scope.go:117] "RemoveContainer" containerID="eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075" Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.493771 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075\": container with ID starting with eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075 not found: ID does not exist" containerID="eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.493811 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075"} err="failed to get container status \"eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075\": rpc error: code = NotFound desc = could not find container \"eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075\": container with ID starting with eedd728c8666e16f85357994525defa49f317b8b9588d32cca1b54cc4c340075 not found: ID does not exist" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.493836 4907 scope.go:117] "RemoveContainer" containerID="515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558" Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.495240 4907 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558\": container with ID starting with 515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558 not found: ID does not exist" containerID="515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.495283 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558"} err="failed to get container status \"515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558\": rpc error: code = NotFound desc = could not find container \"515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558\": container with ID starting with 515968f39c437c71be7219c73e76cd1d0715b36156557ddd4bd29e04785a9558 not found: ID does not exist" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.495313 4907 scope.go:117] "RemoveContainer" containerID="2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27" Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.495777 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27\": container with ID starting with 2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27 not found: ID does not exist" containerID="2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.495965 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27"} err="failed to get container status \"2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27\": rpc error: code = NotFound desc = could not find container 
\"2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27\": container with ID starting with 2d29a3e7e67bbadd46af5590927722235ff2f00140af21ab5b1459a13d559b27 not found: ID does not exist" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.496110 4907 scope.go:117] "RemoveContainer" containerID="564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa" Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.496738 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa\": container with ID starting with 564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa not found: ID does not exist" containerID="564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.496783 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa"} err="failed to get container status \"564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa\": rpc error: code = NotFound desc = could not find container \"564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa\": container with ID starting with 564efc7d8759ef8440744cea6a50549ccc279e4eee3c6f4433724e5f5cfe66aa not found: ID does not exist" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.505244 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a805e428-d0d8-423c-b0c7-f8d7bbd31408-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.520256 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.627109 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.637248 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.648703 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.649059 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.656635 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.657008 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-central-agent" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657022 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-central-agent" Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.657037 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="proxy-httpd" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657046 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="proxy-httpd" Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.657059 4907 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="sg-core" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657097 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="sg-core" Oct 09 19:48:26 crc kubenswrapper[4907]: E1009 19:48:26.657110 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-notification-agent" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657116 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-notification-agent" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657302 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="proxy-httpd" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657315 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="sg-core" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657322 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-central-agent" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.657336 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" containerName="ceilometer-notification-agent" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.659032 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.661242 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.661381 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.664233 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.671591 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810027 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29lpf\" (UniqueName: \"kubernetes.io/projected/e4ffbb8c-37f8-4037-88ac-bfda8292a209-kube-api-access-29lpf\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-run-httpd\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810541 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810608 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-scripts\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810641 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-log-httpd\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810806 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-config-data\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.810881 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912587 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-config-data\") pod \"ceilometer-0\" (UID: 
\"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912665 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912769 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29lpf\" (UniqueName: \"kubernetes.io/projected/e4ffbb8c-37f8-4037-88ac-bfda8292a209-kube-api-access-29lpf\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-run-httpd\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912912 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0" Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912940 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-scripts\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.912957 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-log-httpd\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.913294 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-log-httpd\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.913548 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-run-httpd\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.916266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.916616 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.917182 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.917676 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-config-data\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.918021 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-scripts\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.930878 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29lpf\" (UniqueName: \"kubernetes.io/projected/e4ffbb8c-37f8-4037-88ac-bfda8292a209-kube-api-access-29lpf\") pod \"ceilometer-0\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " pod="openstack/ceilometer-0"
Oct 09 19:48:26 crc kubenswrapper[4907]: I1009 19:48:26.988602 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 09 19:48:27 crc kubenswrapper[4907]: I1009 19:48:27.165545 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a805e428-d0d8-423c-b0c7-f8d7bbd31408" path="/var/lib/kubelet/pods/a805e428-d0d8-423c-b0c7-f8d7bbd31408/volumes"
Oct 09 19:48:27 crc kubenswrapper[4907]: I1009 19:48:27.460318 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 19:48:27 crc kubenswrapper[4907]: W1009 19:48:27.465651 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4ffbb8c_37f8_4037_88ac_bfda8292a209.slice/crio-6dbad8e1158d10f39e9a611211ac4d91fc200646fde97a1df11d38e451d6525b WatchSource:0}: Error finding container 6dbad8e1158d10f39e9a611211ac4d91fc200646fde97a1df11d38e451d6525b: Status 404 returned error can't find the container with id 6dbad8e1158d10f39e9a611211ac4d91fc200646fde97a1df11d38e451d6525b
Oct 09 19:48:28 crc kubenswrapper[4907]: I1009 19:48:28.313203 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerStarted","Data":"6dbad8e1158d10f39e9a611211ac4d91fc200646fde97a1df11d38e451d6525b"}
Oct 09 19:48:29 crc kubenswrapper[4907]: I1009 19:48:29.323993 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerStarted","Data":"8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053"}
Oct 09 19:48:29 crc kubenswrapper[4907]: I1009 19:48:29.850067 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 09 19:48:29 crc kubenswrapper[4907]: I1009 19:48:29.879211 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 09 19:48:30 crc kubenswrapper[4907]: I1009 19:48:30.337864 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerStarted","Data":"12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3"}
Oct 09 19:48:30 crc kubenswrapper[4907]: I1009 19:48:30.372027 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 09 19:48:30 crc kubenswrapper[4907]: I1009 19:48:30.908450 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 09 19:48:30 crc kubenswrapper[4907]: I1009 19:48:30.908839 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 09 19:48:31 crc kubenswrapper[4907]: I1009 19:48:31.361684 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerStarted","Data":"5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54"}
Oct 09 19:48:31 crc kubenswrapper[4907]: I1009 19:48:31.885998 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 09 19:48:31 crc kubenswrapper[4907]: I1009 19:48:31.992623 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 09 19:48:31 crc kubenswrapper[4907]: I1009 19:48:31.992623 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 09 19:48:33 crc kubenswrapper[4907]: I1009 19:48:33.385137 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerStarted","Data":"1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c"}
Oct 09 19:48:33 crc kubenswrapper[4907]: I1009 19:48:33.385768 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 09 19:48:33 crc kubenswrapper[4907]: I1009 19:48:33.410490 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.610857375 podStartE2EDuration="7.410449886s" podCreationTimestamp="2025-10-09 19:48:26 +0000 UTC" firstStartedPulling="2025-10-09 19:48:27.468267385 +0000 UTC m=+1193.000234874" lastFinishedPulling="2025-10-09 19:48:32.267859896 +0000 UTC m=+1197.799827385" observedRunningTime="2025-10-09 19:48:33.407431143 +0000 UTC m=+1198.939398642" watchObservedRunningTime="2025-10-09 19:48:33.410449886 +0000 UTC m=+1198.942417385"
Oct 09 19:48:35 crc kubenswrapper[4907]: I1009 19:48:35.646708 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 09 19:48:35 crc kubenswrapper[4907]: I1009 19:48:35.649318 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 09 19:48:35 crc kubenswrapper[4907]: I1009 19:48:35.660736 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 09 19:48:36 crc kubenswrapper[4907]: I1009 19:48:36.417756 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.433344 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.435442 4907 generic.go:334] "Generic (PLEG): container finished" podID="b7705908-7b6b-4a07-91bd-1d2e03bd8e22" containerID="53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc" exitCode=137
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.435542 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7705908-7b6b-4a07-91bd-1d2e03bd8e22","Type":"ContainerDied","Data":"53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc"}
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.435640 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7705908-7b6b-4a07-91bd-1d2e03bd8e22","Type":"ContainerDied","Data":"4f51253f10019ca3fca92d95fbd69047ab78a84247e82b8c693c70abf5e5ccbf"}
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.435670 4907 scope.go:117] "RemoveContainer" containerID="53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc"
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.461244 4907 scope.go:117] "RemoveContainer" containerID="53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc"
Oct 09 19:48:38 crc kubenswrapper[4907]: E1009 19:48:38.463062 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc\": container with ID starting with 53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc not found: ID does not exist" containerID="53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc"
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.463120 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc"} err="failed to get container status \"53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc\": rpc error: code = NotFound desc = could not find container \"53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc\": container with ID starting with 53c9257d853565e41973232d3b0f6be2d5e0d3f5f1681773521b67781fb6a6cc not found: ID does not exist"
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.538016 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-combined-ca-bundle\") pod \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") "
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.538241 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zd7z\" (UniqueName: \"kubernetes.io/projected/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-kube-api-access-7zd7z\") pod \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") "
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.538453 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-config-data\") pod \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\" (UID: \"b7705908-7b6b-4a07-91bd-1d2e03bd8e22\") "
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.544982 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-kube-api-access-7zd7z" (OuterVolumeSpecName: "kube-api-access-7zd7z") pod "b7705908-7b6b-4a07-91bd-1d2e03bd8e22" (UID: "b7705908-7b6b-4a07-91bd-1d2e03bd8e22"). InnerVolumeSpecName "kube-api-access-7zd7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.576953 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7705908-7b6b-4a07-91bd-1d2e03bd8e22" (UID: "b7705908-7b6b-4a07-91bd-1d2e03bd8e22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.586778 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-config-data" (OuterVolumeSpecName: "config-data") pod "b7705908-7b6b-4a07-91bd-1d2e03bd8e22" (UID: "b7705908-7b6b-4a07-91bd-1d2e03bd8e22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.641234 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zd7z\" (UniqueName: \"kubernetes.io/projected/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-kube-api-access-7zd7z\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.641271 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:38 crc kubenswrapper[4907]: I1009 19:48:38.641285 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7705908-7b6b-4a07-91bd-1d2e03bd8e22-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.444729 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.468418 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.476103 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.497743 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 09 19:48:39 crc kubenswrapper[4907]: E1009 19:48:39.498186 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7705908-7b6b-4a07-91bd-1d2e03bd8e22" containerName="nova-cell1-novncproxy-novncproxy"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.498201 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7705908-7b6b-4a07-91bd-1d2e03bd8e22" containerName="nova-cell1-novncproxy-novncproxy"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.498441 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7705908-7b6b-4a07-91bd-1d2e03bd8e22" containerName="nova-cell1-novncproxy-novncproxy"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.499183 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.501586 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.501616 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.501925 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.532152 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.558733 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.559109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.559244 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.559956 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfp2h\" (UniqueName: \"kubernetes.io/projected/01275002-ecaa-441e-b1a1-035dd770cb1d-kube-api-access-bfp2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.560353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.662252 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.662400 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.662446 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.662489 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfp2h\" (UniqueName: \"kubernetes.io/projected/01275002-ecaa-441e-b1a1-035dd770cb1d-kube-api-access-bfp2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.662544 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.668115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.668855 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.669235 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.669849 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01275002-ecaa-441e-b1a1-035dd770cb1d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.681419 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfp2h\" (UniqueName: \"kubernetes.io/projected/01275002-ecaa-441e-b1a1-035dd770cb1d-kube-api-access-bfp2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"01275002-ecaa-441e-b1a1-035dd770cb1d\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:39 crc kubenswrapper[4907]: I1009 19:48:39.825844 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:40 crc kubenswrapper[4907]: I1009 19:48:40.355034 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 09 19:48:40 crc kubenswrapper[4907]: I1009 19:48:40.459972 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"01275002-ecaa-441e-b1a1-035dd770cb1d","Type":"ContainerStarted","Data":"436c404cce5d3531870b5ea3771e2861451840c2617bfb06b0bbf7b71a6b13fa"}
Oct 09 19:48:40 crc kubenswrapper[4907]: I1009 19:48:40.914052 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 09 19:48:40 crc kubenswrapper[4907]: I1009 19:48:40.914781 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 09 19:48:40 crc kubenswrapper[4907]: I1009 19:48:40.916363 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 09 19:48:40 crc kubenswrapper[4907]: I1009 19:48:40.920321 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.165427 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7705908-7b6b-4a07-91bd-1d2e03bd8e22" path="/var/lib/kubelet/pods/b7705908-7b6b-4a07-91bd-1d2e03bd8e22/volumes"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.481707 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"01275002-ecaa-441e-b1a1-035dd770cb1d","Type":"ContainerStarted","Data":"66973a8ff99274774c7b26b4604b7990a35b101b8ac23643e2620099c51178ca"}
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.481791 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.487639 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.528811 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.528787934 podStartE2EDuration="2.528787934s" podCreationTimestamp="2025-10-09 19:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:41.525297469 +0000 UTC m=+1207.057265028" watchObservedRunningTime="2025-10-09 19:48:41.528787934 +0000 UTC m=+1207.060755423"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.720929 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-br6lt"]
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.727734 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.737261 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-br6lt"]
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.807612 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.807684 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg29m\" (UniqueName: \"kubernetes.io/projected/c47a2842-7ddb-4676-9d20-f507044a2b76-kube-api-access-sg29m\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.807718 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.807753 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-config\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.807792 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.807869 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.909404 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.909677 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg29m\" (UniqueName: \"kubernetes.io/projected/c47a2842-7ddb-4676-9d20-f507044a2b76-kube-api-access-sg29m\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.909763 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.909895 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-config\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.910006 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.910157 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.910865 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.910874 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-config\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.910912 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.910969 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.911246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:41 crc kubenswrapper[4907]: I1009 19:48:41.929054 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg29m\" (UniqueName: \"kubernetes.io/projected/c47a2842-7ddb-4676-9d20-f507044a2b76-kube-api-access-sg29m\") pod \"dnsmasq-dns-59cf4bdb65-br6lt\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:42 crc kubenswrapper[4907]: I1009 19:48:42.052476 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:42 crc kubenswrapper[4907]: W1009 19:48:42.615802 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc47a2842_7ddb_4676_9d20_f507044a2b76.slice/crio-1a71ecf73900304c3f3e68bfbc6138f241f689ee1902990e6200017ab3aefed1 WatchSource:0}: Error finding container 1a71ecf73900304c3f3e68bfbc6138f241f689ee1902990e6200017ab3aefed1: Status 404 returned error can't find the container with id 1a71ecf73900304c3f3e68bfbc6138f241f689ee1902990e6200017ab3aefed1
Oct 09 19:48:42 crc kubenswrapper[4907]: I1009 19:48:42.622401 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-br6lt"]
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.503828 4907 generic.go:334] "Generic (PLEG): container finished" podID="c47a2842-7ddb-4676-9d20-f507044a2b76" containerID="33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533" exitCode=0
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.503884 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" event={"ID":"c47a2842-7ddb-4676-9d20-f507044a2b76","Type":"ContainerDied","Data":"33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533"}
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.504291 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" event={"ID":"c47a2842-7ddb-4676-9d20-f507044a2b76","Type":"ContainerStarted","Data":"1a71ecf73900304c3f3e68bfbc6138f241f689ee1902990e6200017ab3aefed1"}
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.769074 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.769734 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="ceilometer-central-agent" containerID="cri-o://8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053" gracePeriod=30
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.769889 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="proxy-httpd" containerID="cri-o://1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c" gracePeriod=30
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.769952 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="sg-core" containerID="cri-o://5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54" gracePeriod=30
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.769999 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="ceilometer-notification-agent" containerID="cri-o://12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3" gracePeriod=30
Oct 09 19:48:43 crc kubenswrapper[4907]: I1009 19:48:43.773792 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.201:3000/\": read tcp 10.217.0.2:38494->10.217.0.201:3000: read: connection reset by peer"
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.378830 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.516202 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" event={"ID":"c47a2842-7ddb-4676-9d20-f507044a2b76","Type":"ContainerStarted","Data":"f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773"}
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.517341 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.521344 4907 generic.go:334] "Generic (PLEG): container finished" podID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerID="1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c" exitCode=0
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.521398 4907 generic.go:334] "Generic (PLEG): container finished" podID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerID="5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54" exitCode=2
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.521414 4907 generic.go:334] "Generic (PLEG): container finished" podID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerID="8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053" exitCode=0
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.521442 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerDied","Data":"1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c"}
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.521507 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerDied","Data":"5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54"}
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.521521 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerDied","Data":"8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053"}
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.521798 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-log" containerID="cri-o://157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988" gracePeriod=30
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.522080 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-api" containerID="cri-o://4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214" gracePeriod=30
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.546906 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" podStartSLOduration=3.5468890379999998 podStartE2EDuration="3.546889038s" podCreationTimestamp="2025-10-09 19:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:44.53789241 +0000 UTC m=+1210.069859919" watchObservedRunningTime="2025-10-09 19:48:44.546889038 +0000 UTC m=+1210.078856517"
Oct 09 19:48:44 crc kubenswrapper[4907]: I1009 19:48:44.826035 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 09 19:48:45 crc kubenswrapper[4907]: I1009 19:48:45.537824 4907 generic.go:334] "Generic (PLEG): container finished" podID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerID="157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988" exitCode=143
Oct 09 19:48:45 crc kubenswrapper[4907]: I1009 19:48:45.537877 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4ecf8b6-cb47-4096-85e1-4286f45529db","Type":"ContainerDied","Data":"157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988"}
Oct 09 19:48:45 crc
kubenswrapper[4907]: I1009 19:48:45.969089 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.089415 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-combined-ca-bundle\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.089477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-config-data\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.089519 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-ceilometer-tls-certs\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.089626 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29lpf\" (UniqueName: \"kubernetes.io/projected/e4ffbb8c-37f8-4037-88ac-bfda8292a209-kube-api-access-29lpf\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.089666 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-sg-core-conf-yaml\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 
19:48:46.089700 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-run-httpd\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.089755 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-scripts\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.089858 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-log-httpd\") pod \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\" (UID: \"e4ffbb8c-37f8-4037-88ac-bfda8292a209\") " Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.090633 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.091228 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.099456 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ffbb8c-37f8-4037-88ac-bfda8292a209-kube-api-access-29lpf" (OuterVolumeSpecName: "kube-api-access-29lpf") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "kube-api-access-29lpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.099653 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-scripts" (OuterVolumeSpecName: "scripts") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.122339 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.154445 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.182099 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.192501 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.192545 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.192579 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.192589 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.192601 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29lpf\" (UniqueName: \"kubernetes.io/projected/e4ffbb8c-37f8-4037-88ac-bfda8292a209-kube-api-access-29lpf\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.192609 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.192617 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ffbb8c-37f8-4037-88ac-bfda8292a209-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.204777 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-config-data" (OuterVolumeSpecName: "config-data") pod "e4ffbb8c-37f8-4037-88ac-bfda8292a209" (UID: "e4ffbb8c-37f8-4037-88ac-bfda8292a209"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.294512 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ffbb8c-37f8-4037-88ac-bfda8292a209-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.553200 4907 generic.go:334] "Generic (PLEG): container finished" podID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerID="12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3" exitCode=0 Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.553279 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.553300 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerDied","Data":"12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3"} Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.553363 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ffbb8c-37f8-4037-88ac-bfda8292a209","Type":"ContainerDied","Data":"6dbad8e1158d10f39e9a611211ac4d91fc200646fde97a1df11d38e451d6525b"} Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.553387 4907 scope.go:117] "RemoveContainer" containerID="1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.588448 4907 scope.go:117] "RemoveContainer" containerID="5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.591945 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.609373 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.616091 4907 scope.go:117] "RemoveContainer" containerID="12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.620595 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.621077 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="ceilometer-central-agent" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621111 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" 
containerName="ceilometer-central-agent" Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.621133 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="ceilometer-notification-agent" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621141 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="ceilometer-notification-agent" Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.621170 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="proxy-httpd" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621180 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="proxy-httpd" Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.621202 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="sg-core" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621210 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="sg-core" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621456 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="ceilometer-notification-agent" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621498 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="ceilometer-central-agent" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621520 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="proxy-httpd" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.621543 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" containerName="sg-core" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.630510 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.638234 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.638501 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.638925 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.641707 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.662526 4907 scope.go:117] "RemoveContainer" containerID="8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.698837 4907 scope.go:117] "RemoveContainer" containerID="1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c" Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.699204 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c\": container with ID starting with 1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c not found: ID does not exist" containerID="1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.699255 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c"} err="failed to get container status 
\"1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c\": rpc error: code = NotFound desc = could not find container \"1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c\": container with ID starting with 1d4321593f1f3910e5a0e06082e5e7ec1220936e9555168dcdf3cbc750fe6f2c not found: ID does not exist" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.699276 4907 scope.go:117] "RemoveContainer" containerID="5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54" Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.699522 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54\": container with ID starting with 5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54 not found: ID does not exist" containerID="5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.699550 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54"} err="failed to get container status \"5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54\": rpc error: code = NotFound desc = could not find container \"5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54\": container with ID starting with 5285e8e79fa18a79daa7f2bdac560dcf83569538bd12414909b7274fe31b5b54 not found: ID does not exist" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.699566 4907 scope.go:117] "RemoveContainer" containerID="12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3" Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.699803 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3\": container with ID starting with 12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3 not found: ID does not exist" containerID="12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.699828 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3"} err="failed to get container status \"12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3\": rpc error: code = NotFound desc = could not find container \"12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3\": container with ID starting with 12c8c9c689fe7e2ce7b5e5ce699e3f5f7eb3e902d9c8061eaa9cd5e24afe18a3 not found: ID does not exist" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.699844 4907 scope.go:117] "RemoveContainer" containerID="8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053" Oct 09 19:48:46 crc kubenswrapper[4907]: E1009 19:48:46.700196 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053\": container with ID starting with 8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053 not found: ID does not exist" containerID="8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.700231 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053"} err="failed to get container status \"8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053\": rpc error: code = NotFound desc = could not find container \"8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053\": container with ID 
starting with 8c7654d43f18f9ad824f4d7bcbd2f973b0423eb104dd206423a38a1e32e06053 not found: ID does not exist" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.705213 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-log-httpd\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.705285 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.705316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-run-httpd\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.705344 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-scripts\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.705721 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc 
kubenswrapper[4907]: I1009 19:48:46.706185 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.706252 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-config-data\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.706438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnvw\" (UniqueName: \"kubernetes.io/projected/7ece5540-61b8-4f64-b55d-d3a93be86382-kube-api-access-fnnvw\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808541 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-scripts\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808643 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808671 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808692 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-config-data\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808730 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnvw\" (UniqueName: \"kubernetes.io/projected/7ece5540-61b8-4f64-b55d-d3a93be86382-kube-api-access-fnnvw\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808756 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-log-httpd\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.808818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-run-httpd\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc 
kubenswrapper[4907]: I1009 19:48:46.809777 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-log-httpd\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.809972 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-run-httpd\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.812617 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.812675 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.813281 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-scripts\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.813459 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-config-data\") pod \"ceilometer-0\" (UID: 
\"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.823513 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.826137 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnvw\" (UniqueName: \"kubernetes.io/projected/7ece5540-61b8-4f64-b55d-d3a93be86382-kube-api-access-fnnvw\") pod \"ceilometer-0\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " pod="openstack/ceilometer-0" Oct 09 19:48:46 crc kubenswrapper[4907]: I1009 19:48:46.998399 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:48:47 crc kubenswrapper[4907]: I1009 19:48:47.165521 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ffbb8c-37f8-4037-88ac-bfda8292a209" path="/var/lib/kubelet/pods/e4ffbb8c-37f8-4037-88ac-bfda8292a209/volumes" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.073317 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:48:48 crc kubenswrapper[4907]: W1009 19:48:48.077197 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ece5540_61b8_4f64_b55d_d3a93be86382.slice/crio-abf6d1fd46ddc94455e30420751c8b48c44f61bb7f587ec5721e15f2416ced7f WatchSource:0}: Error finding container abf6d1fd46ddc94455e30420751c8b48c44f61bb7f587ec5721e15f2416ced7f: Status 404 returned error can't find the container with id abf6d1fd46ddc94455e30420751c8b48c44f61bb7f587ec5721e15f2416ced7f Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.080148 4907 provider.go:102] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.174866 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.339222 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-combined-ca-bundle\") pod \"c4ecf8b6-cb47-4096-85e1-4286f45529db\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.339408 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9w99\" (UniqueName: \"kubernetes.io/projected/c4ecf8b6-cb47-4096-85e1-4286f45529db-kube-api-access-c9w99\") pod \"c4ecf8b6-cb47-4096-85e1-4286f45529db\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.339447 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-config-data\") pod \"c4ecf8b6-cb47-4096-85e1-4286f45529db\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.339492 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ecf8b6-cb47-4096-85e1-4286f45529db-logs\") pod \"c4ecf8b6-cb47-4096-85e1-4286f45529db\" (UID: \"c4ecf8b6-cb47-4096-85e1-4286f45529db\") " Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.340475 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ecf8b6-cb47-4096-85e1-4286f45529db-logs" (OuterVolumeSpecName: "logs") pod "c4ecf8b6-cb47-4096-85e1-4286f45529db" (UID: "c4ecf8b6-cb47-4096-85e1-4286f45529db"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.346583 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ecf8b6-cb47-4096-85e1-4286f45529db-kube-api-access-c9w99" (OuterVolumeSpecName: "kube-api-access-c9w99") pod "c4ecf8b6-cb47-4096-85e1-4286f45529db" (UID: "c4ecf8b6-cb47-4096-85e1-4286f45529db"). InnerVolumeSpecName "kube-api-access-c9w99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.375300 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4ecf8b6-cb47-4096-85e1-4286f45529db" (UID: "c4ecf8b6-cb47-4096-85e1-4286f45529db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.383281 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-config-data" (OuterVolumeSpecName: "config-data") pod "c4ecf8b6-cb47-4096-85e1-4286f45529db" (UID: "c4ecf8b6-cb47-4096-85e1-4286f45529db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.442180 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9w99\" (UniqueName: \"kubernetes.io/projected/c4ecf8b6-cb47-4096-85e1-4286f45529db-kube-api-access-c9w99\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.442219 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.442234 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4ecf8b6-cb47-4096-85e1-4286f45529db-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.442245 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4ecf8b6-cb47-4096-85e1-4286f45529db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.579781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerStarted","Data":"abf6d1fd46ddc94455e30420751c8b48c44f61bb7f587ec5721e15f2416ced7f"} Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.582315 4907 generic.go:334] "Generic (PLEG): container finished" podID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerID="4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214" exitCode=0 Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.582386 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4ecf8b6-cb47-4096-85e1-4286f45529db","Type":"ContainerDied","Data":"4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214"} Oct 09 19:48:48 crc 
kubenswrapper[4907]: I1009 19:48:48.582412 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4ecf8b6-cb47-4096-85e1-4286f45529db","Type":"ContainerDied","Data":"89857638abda854439a41ca595d71dfadf3440c8b5e87217a1241441fc6e7eb5"} Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.582428 4907 scope.go:117] "RemoveContainer" containerID="4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.582532 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.606738 4907 scope.go:117] "RemoveContainer" containerID="157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.632620 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.637080 4907 scope.go:117] "RemoveContainer" containerID="4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214" Oct 09 19:48:48 crc kubenswrapper[4907]: E1009 19:48:48.637745 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214\": container with ID starting with 4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214 not found: ID does not exist" containerID="4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.637926 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214"} err="failed to get container status \"4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214\": rpc error: code = NotFound desc = could not find container 
\"4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214\": container with ID starting with 4a6acdd998fbe9fca88e3406460fe446744c9ae52de6a449a5108d128ac1a214 not found: ID does not exist" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.637998 4907 scope.go:117] "RemoveContainer" containerID="157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988" Oct 09 19:48:48 crc kubenswrapper[4907]: E1009 19:48:48.638337 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988\": container with ID starting with 157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988 not found: ID does not exist" containerID="157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.638366 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988"} err="failed to get container status \"157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988\": rpc error: code = NotFound desc = could not find container \"157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988\": container with ID starting with 157c24222dfdfd40078bc40fd17a0b3ee95e6953c0c84a7a1f6ccba022941988 not found: ID does not exist" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.644359 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.694062 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:48 crc kubenswrapper[4907]: E1009 19:48:48.694583 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-log" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.694602 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-log" Oct 09 19:48:48 crc kubenswrapper[4907]: E1009 19:48:48.694642 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-api" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.694651 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-api" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.694875 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-api" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.694895 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" containerName="nova-api-log" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.696589 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.698261 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.698320 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.698495 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.706406 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.849393 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-public-tls-certs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.849528 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-config-data\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.849589 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc8pl\" (UniqueName: \"kubernetes.io/projected/7006ac30-161a-41ef-a0d2-47ef2de30194-kube-api-access-vc8pl\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.849626 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.849695 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7006ac30-161a-41ef-a0d2-47ef2de30194-logs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.849720 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.951384 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7006ac30-161a-41ef-a0d2-47ef2de30194-logs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.951431 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.951481 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-public-tls-certs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 
19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.951546 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-config-data\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.951587 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc8pl\" (UniqueName: \"kubernetes.io/projected/7006ac30-161a-41ef-a0d2-47ef2de30194-kube-api-access-vc8pl\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.951612 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.952476 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7006ac30-161a-41ef-a0d2-47ef2de30194-logs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.956809 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-config-data\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.957391 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.967324 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-public-tls-certs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.967821 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:48 crc kubenswrapper[4907]: I1009 19:48:48.974955 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc8pl\" (UniqueName: \"kubernetes.io/projected/7006ac30-161a-41ef-a0d2-47ef2de30194-kube-api-access-vc8pl\") pod \"nova-api-0\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") " pod="openstack/nova-api-0" Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.020431 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.191630 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ecf8b6-cb47-4096-85e1-4286f45529db" path="/var/lib/kubelet/pods/c4ecf8b6-cb47-4096-85e1-4286f45529db/volumes" Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.462348 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:48:49 crc kubenswrapper[4907]: W1009 19:48:49.473393 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7006ac30_161a_41ef_a0d2_47ef2de30194.slice/crio-506ee34514b38771ca02033d6dc9440cf3ed9736e5f114597eef6ab88d527364 WatchSource:0}: Error finding container 506ee34514b38771ca02033d6dc9440cf3ed9736e5f114597eef6ab88d527364: Status 404 returned error can't find the container with id 506ee34514b38771ca02033d6dc9440cf3ed9736e5f114597eef6ab88d527364 Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.595553 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7006ac30-161a-41ef-a0d2-47ef2de30194","Type":"ContainerStarted","Data":"506ee34514b38771ca02033d6dc9440cf3ed9736e5f114597eef6ab88d527364"} Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.596903 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerStarted","Data":"fabfcc1179c1e16af17e3435771abec67ea9abf35a6740e22c4cd40b847bcd83"} Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.596934 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerStarted","Data":"eb5fcdc04ee77dd56e0078b836494394e233ace9cd3a774d3636a7a77e702697"} Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.826523 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:49 crc kubenswrapper[4907]: I1009 19:48:49.907522 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.609849 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7006ac30-161a-41ef-a0d2-47ef2de30194","Type":"ContainerStarted","Data":"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"} Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.610188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7006ac30-161a-41ef-a0d2-47ef2de30194","Type":"ContainerStarted","Data":"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"} Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.615530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerStarted","Data":"3df386cada414505981df7b8db6dbc6c790994ac71a223083363d31e65859eb4"} Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.638309 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.638292274 podStartE2EDuration="2.638292274s" podCreationTimestamp="2025-10-09 19:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:50.633391295 +0000 UTC m=+1216.165358804" watchObservedRunningTime="2025-10-09 19:48:50.638292274 +0000 UTC m=+1216.170259763" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.648560 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.823459 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-vttg4"] Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.825121 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.827230 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.827688 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.841812 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vttg4"] Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.885633 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-scripts\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.885697 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxpvr\" (UniqueName: \"kubernetes.io/projected/9565b208-b543-4da1-ba7e-fcf358d55bdb-kube-api-access-lxpvr\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.885739 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc 
kubenswrapper[4907]: I1009 19:48:50.885875 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-config-data\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.987516 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxpvr\" (UniqueName: \"kubernetes.io/projected/9565b208-b543-4da1-ba7e-fcf358d55bdb-kube-api-access-lxpvr\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.987583 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.987706 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-config-data\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.987739 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-scripts\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 
19:48:50.992686 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-scripts\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:50 crc kubenswrapper[4907]: I1009 19:48:50.993122 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-config-data\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:51 crc kubenswrapper[4907]: I1009 19:48:51.012399 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:51 crc kubenswrapper[4907]: I1009 19:48:51.017896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxpvr\" (UniqueName: \"kubernetes.io/projected/9565b208-b543-4da1-ba7e-fcf358d55bdb-kube-api-access-lxpvr\") pod \"nova-cell1-cell-mapping-vttg4\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") " pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:51 crc kubenswrapper[4907]: I1009 19:48:51.141105 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vttg4" Oct 09 19:48:51 crc kubenswrapper[4907]: I1009 19:48:51.628507 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerStarted","Data":"51821cad36010203510dc30e22dc2e3f8e888516a73790af43b2d2817ab05de4"} Oct 09 19:48:51 crc kubenswrapper[4907]: I1009 19:48:51.629304 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 19:48:51 crc kubenswrapper[4907]: I1009 19:48:51.655637 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.534907407 podStartE2EDuration="5.655615598s" podCreationTimestamp="2025-10-09 19:48:46 +0000 UTC" firstStartedPulling="2025-10-09 19:48:48.079742992 +0000 UTC m=+1213.611710481" lastFinishedPulling="2025-10-09 19:48:51.200451183 +0000 UTC m=+1216.732418672" observedRunningTime="2025-10-09 19:48:51.650690619 +0000 UTC m=+1217.182658118" watchObservedRunningTime="2025-10-09 19:48:51.655615598 +0000 UTC m=+1217.187583097" Oct 09 19:48:51 crc kubenswrapper[4907]: I1009 19:48:51.679308 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vttg4"] Oct 09 19:48:51 crc kubenswrapper[4907]: W1009 19:48:51.685646 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9565b208_b543_4da1_ba7e_fcf358d55bdb.slice/crio-9c84a90fbce156c9a75ea771571df516ba349f4933b3ba03fffa67e593821e45 WatchSource:0}: Error finding container 9c84a90fbce156c9a75ea771571df516ba349f4933b3ba03fffa67e593821e45: Status 404 returned error can't find the container with id 9c84a90fbce156c9a75ea771571df516ba349f4933b3ba03fffa67e593821e45 Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.053661 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt"
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.142373 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-tkk4m"]
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.142636 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" containerName="dnsmasq-dns" containerID="cri-o://cab012b751061895123e234f0961d089c5b413dab465ae86c1432711255f56ee" gracePeriod=10
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.677141 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vttg4" event={"ID":"9565b208-b543-4da1-ba7e-fcf358d55bdb","Type":"ContainerStarted","Data":"c9be4375f904a7e34c412ca79227428d7391399f69c83bc8e6f2e678e17f9b78"}
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.677586 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vttg4" event={"ID":"9565b208-b543-4da1-ba7e-fcf358d55bdb","Type":"ContainerStarted","Data":"9c84a90fbce156c9a75ea771571df516ba349f4933b3ba03fffa67e593821e45"}
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.682784 4907 generic.go:334] "Generic (PLEG): container finished" podID="ea4853ad-8428-43e1-839b-788a9e672eec" containerID="cab012b751061895123e234f0961d089c5b413dab465ae86c1432711255f56ee" exitCode=0
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.683220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" event={"ID":"ea4853ad-8428-43e1-839b-788a9e672eec","Type":"ContainerDied","Data":"cab012b751061895123e234f0961d089c5b413dab465ae86c1432711255f56ee"}
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.683290 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" event={"ID":"ea4853ad-8428-43e1-839b-788a9e672eec","Type":"ContainerDied","Data":"d48631948bc0c7a8f3e99fbd80707ecc578095cc366aba5b854ed7d02393b922"}
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.683308 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d48631948bc0c7a8f3e99fbd80707ecc578095cc366aba5b854ed7d02393b922"
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.697678 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m"
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.701212 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vttg4" podStartSLOduration=2.701184227 podStartE2EDuration="2.701184227s" podCreationTimestamp="2025-10-09 19:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:48:52.696052643 +0000 UTC m=+1218.228020132" watchObservedRunningTime="2025-10-09 19:48:52.701184227 +0000 UTC m=+1218.233151726"
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.873819 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-config\") pod \"ea4853ad-8428-43e1-839b-788a9e672eec\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") "
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.873990 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-sb\") pod \"ea4853ad-8428-43e1-839b-788a9e672eec\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") "
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.874110 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-swift-storage-0\") pod \"ea4853ad-8428-43e1-839b-788a9e672eec\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") "
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.874164 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-svc\") pod \"ea4853ad-8428-43e1-839b-788a9e672eec\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") "
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.874199 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjhcl\" (UniqueName: \"kubernetes.io/projected/ea4853ad-8428-43e1-839b-788a9e672eec-kube-api-access-mjhcl\") pod \"ea4853ad-8428-43e1-839b-788a9e672eec\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") "
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.874226 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-nb\") pod \"ea4853ad-8428-43e1-839b-788a9e672eec\" (UID: \"ea4853ad-8428-43e1-839b-788a9e672eec\") "
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.891381 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4853ad-8428-43e1-839b-788a9e672eec-kube-api-access-mjhcl" (OuterVolumeSpecName: "kube-api-access-mjhcl") pod "ea4853ad-8428-43e1-839b-788a9e672eec" (UID: "ea4853ad-8428-43e1-839b-788a9e672eec"). InnerVolumeSpecName "kube-api-access-mjhcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.927562 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-config" (OuterVolumeSpecName: "config") pod "ea4853ad-8428-43e1-839b-788a9e672eec" (UID: "ea4853ad-8428-43e1-839b-788a9e672eec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.932673 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea4853ad-8428-43e1-839b-788a9e672eec" (UID: "ea4853ad-8428-43e1-839b-788a9e672eec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.938232 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ea4853ad-8428-43e1-839b-788a9e672eec" (UID: "ea4853ad-8428-43e1-839b-788a9e672eec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.938909 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea4853ad-8428-43e1-839b-788a9e672eec" (UID: "ea4853ad-8428-43e1-839b-788a9e672eec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.943151 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea4853ad-8428-43e1-839b-788a9e672eec" (UID: "ea4853ad-8428-43e1-839b-788a9e672eec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.976827 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.976865 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.976879 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjhcl\" (UniqueName: \"kubernetes.io/projected/ea4853ad-8428-43e1-839b-788a9e672eec-kube-api-access-mjhcl\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.976890 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.976902 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-config\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:52 crc kubenswrapper[4907]: I1009 19:48:52.976913 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea4853ad-8428-43e1-839b-788a9e672eec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:53 crc kubenswrapper[4907]: I1009 19:48:53.693603 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m"
Oct 09 19:48:53 crc kubenswrapper[4907]: I1009 19:48:53.724411 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-tkk4m"]
Oct 09 19:48:53 crc kubenswrapper[4907]: I1009 19:48:53.734666 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-tkk4m"]
Oct 09 19:48:55 crc kubenswrapper[4907]: I1009 19:48:55.164789 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" path="/var/lib/kubelet/pods/ea4853ad-8428-43e1-839b-788a9e672eec/volumes"
Oct 09 19:48:56 crc kubenswrapper[4907]: I1009 19:48:56.725169 4907 generic.go:334] "Generic (PLEG): container finished" podID="9565b208-b543-4da1-ba7e-fcf358d55bdb" containerID="c9be4375f904a7e34c412ca79227428d7391399f69c83bc8e6f2e678e17f9b78" exitCode=0
Oct 09 19:48:56 crc kubenswrapper[4907]: I1009 19:48:56.725268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vttg4" event={"ID":"9565b208-b543-4da1-ba7e-fcf358d55bdb","Type":"ContainerDied","Data":"c9be4375f904a7e34c412ca79227428d7391399f69c83bc8e6f2e678e17f9b78"}
Oct 09 19:48:57 crc kubenswrapper[4907]: I1009 19:48:57.427916 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-845d6d6f59-tkk4m" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.193:5353: i/o timeout"
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.091059 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vttg4"
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.176620 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-combined-ca-bundle\") pod \"9565b208-b543-4da1-ba7e-fcf358d55bdb\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") "
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.176683 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-config-data\") pod \"9565b208-b543-4da1-ba7e-fcf358d55bdb\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") "
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.176713 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-scripts\") pod \"9565b208-b543-4da1-ba7e-fcf358d55bdb\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") "
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.176834 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxpvr\" (UniqueName: \"kubernetes.io/projected/9565b208-b543-4da1-ba7e-fcf358d55bdb-kube-api-access-lxpvr\") pod \"9565b208-b543-4da1-ba7e-fcf358d55bdb\" (UID: \"9565b208-b543-4da1-ba7e-fcf358d55bdb\") "
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.182089 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9565b208-b543-4da1-ba7e-fcf358d55bdb-kube-api-access-lxpvr" (OuterVolumeSpecName: "kube-api-access-lxpvr") pod "9565b208-b543-4da1-ba7e-fcf358d55bdb" (UID: "9565b208-b543-4da1-ba7e-fcf358d55bdb"). InnerVolumeSpecName "kube-api-access-lxpvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.184516 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-scripts" (OuterVolumeSpecName: "scripts") pod "9565b208-b543-4da1-ba7e-fcf358d55bdb" (UID: "9565b208-b543-4da1-ba7e-fcf358d55bdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.204714 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-config-data" (OuterVolumeSpecName: "config-data") pod "9565b208-b543-4da1-ba7e-fcf358d55bdb" (UID: "9565b208-b543-4da1-ba7e-fcf358d55bdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.208205 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9565b208-b543-4da1-ba7e-fcf358d55bdb" (UID: "9565b208-b543-4da1-ba7e-fcf358d55bdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.278734 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxpvr\" (UniqueName: \"kubernetes.io/projected/9565b208-b543-4da1-ba7e-fcf358d55bdb-kube-api-access-lxpvr\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.278800 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.278814 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.278822 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9565b208-b543-4da1-ba7e-fcf358d55bdb-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.750879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vttg4" event={"ID":"9565b208-b543-4da1-ba7e-fcf358d55bdb","Type":"ContainerDied","Data":"9c84a90fbce156c9a75ea771571df516ba349f4933b3ba03fffa67e593821e45"}
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.750932 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c84a90fbce156c9a75ea771571df516ba349f4933b3ba03fffa67e593821e45"
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.750979 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vttg4"
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.928429 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.928706 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-log" containerID="cri-o://798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45" gracePeriod=30
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.928804 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-api" containerID="cri-o://03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6" gracePeriod=30
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.942443 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.943533 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c7aa1be1-c204-41b1-8cf2-fc77138a5673" containerName="nova-scheduler-scheduler" containerID="cri-o://baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794" gracePeriod=30
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.956179 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.956420 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-log" containerID="cri-o://2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0" gracePeriod=30
Oct 09 19:48:58 crc kubenswrapper[4907]: I1009 19:48:58.956861 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-metadata" containerID="cri-o://e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46" gracePeriod=30
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.556336 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.710167 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-public-tls-certs\") pod \"7006ac30-161a-41ef-a0d2-47ef2de30194\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") "
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.710230 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-internal-tls-certs\") pod \"7006ac30-161a-41ef-a0d2-47ef2de30194\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") "
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.711033 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc8pl\" (UniqueName: \"kubernetes.io/projected/7006ac30-161a-41ef-a0d2-47ef2de30194-kube-api-access-vc8pl\") pod \"7006ac30-161a-41ef-a0d2-47ef2de30194\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") "
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.711075 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-config-data\") pod \"7006ac30-161a-41ef-a0d2-47ef2de30194\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") "
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.711128 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7006ac30-161a-41ef-a0d2-47ef2de30194-logs\") pod \"7006ac30-161a-41ef-a0d2-47ef2de30194\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") "
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.711155 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-combined-ca-bundle\") pod \"7006ac30-161a-41ef-a0d2-47ef2de30194\" (UID: \"7006ac30-161a-41ef-a0d2-47ef2de30194\") "
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.711565 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7006ac30-161a-41ef-a0d2-47ef2de30194-logs" (OuterVolumeSpecName: "logs") pod "7006ac30-161a-41ef-a0d2-47ef2de30194" (UID: "7006ac30-161a-41ef-a0d2-47ef2de30194"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.711892 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7006ac30-161a-41ef-a0d2-47ef2de30194-logs\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.715585 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7006ac30-161a-41ef-a0d2-47ef2de30194-kube-api-access-vc8pl" (OuterVolumeSpecName: "kube-api-access-vc8pl") pod "7006ac30-161a-41ef-a0d2-47ef2de30194" (UID: "7006ac30-161a-41ef-a0d2-47ef2de30194"). InnerVolumeSpecName "kube-api-access-vc8pl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.742996 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-config-data" (OuterVolumeSpecName: "config-data") pod "7006ac30-161a-41ef-a0d2-47ef2de30194" (UID: "7006ac30-161a-41ef-a0d2-47ef2de30194"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.745599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7006ac30-161a-41ef-a0d2-47ef2de30194" (UID: "7006ac30-161a-41ef-a0d2-47ef2de30194"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.762265 4907 generic.go:334] "Generic (PLEG): container finished" podID="45cabc27-dfc9-4030-9508-cd366682d788" containerID="2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0" exitCode=143
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.762355 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45cabc27-dfc9-4030-9508-cd366682d788","Type":"ContainerDied","Data":"2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0"}
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.765185 4907 generic.go:334] "Generic (PLEG): container finished" podID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerID="03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6" exitCode=0
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.765209 4907 generic.go:334] "Generic (PLEG): container finished" podID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerID="798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45" exitCode=143
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.765229 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7006ac30-161a-41ef-a0d2-47ef2de30194","Type":"ContainerDied","Data":"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"}
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.765248 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7006ac30-161a-41ef-a0d2-47ef2de30194","Type":"ContainerDied","Data":"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"}
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.765260 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7006ac30-161a-41ef-a0d2-47ef2de30194","Type":"ContainerDied","Data":"506ee34514b38771ca02033d6dc9440cf3ed9736e5f114597eef6ab88d527364"}
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.765275 4907 scope.go:117] "RemoveContainer" containerID="03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.765617 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.769130 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7006ac30-161a-41ef-a0d2-47ef2de30194" (UID: "7006ac30-161a-41ef-a0d2-47ef2de30194"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.778351 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7006ac30-161a-41ef-a0d2-47ef2de30194" (UID: "7006ac30-161a-41ef-a0d2-47ef2de30194"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.810714 4907 scope.go:117] "RemoveContainer" containerID="798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.813229 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.813260 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.813274 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.813289 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc8pl\" (UniqueName: \"kubernetes.io/projected/7006ac30-161a-41ef-a0d2-47ef2de30194-kube-api-access-vc8pl\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.813302 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7006ac30-161a-41ef-a0d2-47ef2de30194-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.833405 4907 scope.go:117] "RemoveContainer" containerID="03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"
Oct 09 19:48:59 crc kubenswrapper[4907]: E1009 19:48:59.833919 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6\": container with ID starting with 03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6 not found: ID does not exist" containerID="03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.833960 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"} err="failed to get container status \"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6\": rpc error: code = NotFound desc = could not find container \"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6\": container with ID starting with 03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6 not found: ID does not exist"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.833987 4907 scope.go:117] "RemoveContainer" containerID="798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"
Oct 09 19:48:59 crc kubenswrapper[4907]: E1009 19:48:59.834273 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45\": container with ID starting with 798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45 not found: ID does not exist" containerID="798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.834308 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"} err="failed to get container status \"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45\": rpc error: code = NotFound desc = could not find container \"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45\": container with ID starting with 798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45 not found: ID does not exist"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.834328 4907 scope.go:117] "RemoveContainer" containerID="03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.834657 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6"} err="failed to get container status \"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6\": rpc error: code = NotFound desc = could not find container \"03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6\": container with ID starting with 03f5918ba329da78e80c0506e2ffc411f53de30688591849aa24fae32ce988b6 not found: ID does not exist"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.834685 4907 scope.go:117] "RemoveContainer" containerID="798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"
Oct 09 19:48:59 crc kubenswrapper[4907]: I1009 19:48:59.834885 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45"} err="failed to get container status \"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45\": rpc error: code = NotFound desc = could not find container \"798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45\": container with ID starting with 798cd0e83f66ce54cf1710a4ff4dd3707dc6e021211825860765a8754bc1fe45 not found: ID does not exist"
Oct 09 19:48:59 crc kubenswrapper[4907]: E1009 19:48:59.853132 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 09 19:48:59 crc kubenswrapper[4907]: E1009 19:48:59.854792 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 09 19:48:59 crc kubenswrapper[4907]: E1009 19:48:59.856493 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 09 19:48:59 crc kubenswrapper[4907]: E1009 19:48:59.856596 4907 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c7aa1be1-c204-41b1-8cf2-fc77138a5673" containerName="nova-scheduler-scheduler"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.160235 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.168781 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.194374 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 09 19:49:00 crc kubenswrapper[4907]: E1009 19:49:00.195259 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-log"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.195409 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-log"
Oct 09 19:49:00 crc kubenswrapper[4907]: E1009 19:49:00.195565 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-api"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.195669 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-api"
Oct 09 19:49:00 crc kubenswrapper[4907]: E1009 19:49:00.195816 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9565b208-b543-4da1-ba7e-fcf358d55bdb" containerName="nova-manage"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.196061 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9565b208-b543-4da1-ba7e-fcf358d55bdb" containerName="nova-manage"
Oct 09 19:49:00 crc kubenswrapper[4907]: E1009 19:49:00.196186 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" containerName="init"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.196321 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" containerName="init"
Oct 09 19:49:00 crc kubenswrapper[4907]: E1009 19:49:00.196547 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" containerName="dnsmasq-dns"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.196638 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" containerName="dnsmasq-dns"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.197029 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-log"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.197055 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" containerName="nova-api-api"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.197068 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4853ad-8428-43e1-839b-788a9e672eec" containerName="dnsmasq-dns"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.197113 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9565b208-b543-4da1-ba7e-fcf358d55bdb" containerName="nova-manage"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.198670 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.200891 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.200945 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.201878 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.204660 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.328599 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.328664 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.328688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-config-data\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.328747 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83277287-28b0-43e3-98e7-e8367e7a87d9-logs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.328788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxfn\" (UniqueName: \"kubernetes.io/projected/83277287-28b0-43e3-98e7-e8367e7a87d9-kube-api-access-6vxfn\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.328829 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.430569 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.430941 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.430985 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.431005 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-config-data\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.431055 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83277287-28b0-43e3-98e7-e8367e7a87d9-logs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0"
Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.431092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxfn\" (UniqueName: \"kubernetes.io/projected/83277287-28b0-43e3-98e7-e8367e7a87d9-kube-api-access-6vxfn\") pod \"nova-api-0\" (UID:
\"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.431671 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83277287-28b0-43e3-98e7-e8367e7a87d9-logs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.434571 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.434894 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.434970 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.436196 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83277287-28b0-43e3-98e7-e8367e7a87d9-config-data\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.448063 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxfn\" (UniqueName: 
\"kubernetes.io/projected/83277287-28b0-43e3-98e7-e8367e7a87d9-kube-api-access-6vxfn\") pod \"nova-api-0\" (UID: \"83277287-28b0-43e3-98e7-e8367e7a87d9\") " pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.518094 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 19:49:00 crc kubenswrapper[4907]: I1009 19:49:00.940549 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 19:49:00 crc kubenswrapper[4907]: W1009 19:49:00.943419 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83277287_28b0_43e3_98e7_e8367e7a87d9.slice/crio-8c5f69c6cec85be43a5f09399a1399c443a7937e8cb4004e7e82da0a9fa3f4ea WatchSource:0}: Error finding container 8c5f69c6cec85be43a5f09399a1399c443a7937e8cb4004e7e82da0a9fa3f4ea: Status 404 returned error can't find the container with id 8c5f69c6cec85be43a5f09399a1399c443a7937e8cb4004e7e82da0a9fa3f4ea Oct 09 19:49:01 crc kubenswrapper[4907]: I1009 19:49:01.169598 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7006ac30-161a-41ef-a0d2-47ef2de30194" path="/var/lib/kubelet/pods/7006ac30-161a-41ef-a0d2-47ef2de30194/volumes" Oct 09 19:49:01 crc kubenswrapper[4907]: I1009 19:49:01.787346 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83277287-28b0-43e3-98e7-e8367e7a87d9","Type":"ContainerStarted","Data":"a7216a702932de2e66941f0a53b2749509e6fe9b92376c2ba246a399ae8b0c98"} Oct 09 19:49:01 crc kubenswrapper[4907]: I1009 19:49:01.787715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83277287-28b0-43e3-98e7-e8367e7a87d9","Type":"ContainerStarted","Data":"1bc8182992d4b8e6c45f1b9ddba528748ca11a30a5f3f0aff8c4ecf664b489d5"} Oct 09 19:49:01 crc kubenswrapper[4907]: I1009 19:49:01.787727 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"83277287-28b0-43e3-98e7-e8367e7a87d9","Type":"ContainerStarted","Data":"8c5f69c6cec85be43a5f09399a1399c443a7937e8cb4004e7e82da0a9fa3f4ea"} Oct 09 19:49:01 crc kubenswrapper[4907]: I1009 19:49:01.805913 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.805893201 podStartE2EDuration="1.805893201s" podCreationTimestamp="2025-10-09 19:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:49:01.804164179 +0000 UTC m=+1227.336131688" watchObservedRunningTime="2025-10-09 19:49:01.805893201 +0000 UTC m=+1227.337860690" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.088497 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:44480->10.217.0.196:8775: read: connection reset by peer" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.088593 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:44478->10.217.0.196:8775: read: connection reset by peer" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.601675 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.673070 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx2t5\" (UniqueName: \"kubernetes.io/projected/45cabc27-dfc9-4030-9508-cd366682d788-kube-api-access-hx2t5\") pod \"45cabc27-dfc9-4030-9508-cd366682d788\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.673160 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cabc27-dfc9-4030-9508-cd366682d788-logs\") pod \"45cabc27-dfc9-4030-9508-cd366682d788\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.673193 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-combined-ca-bundle\") pod \"45cabc27-dfc9-4030-9508-cd366682d788\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.673256 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-config-data\") pod \"45cabc27-dfc9-4030-9508-cd366682d788\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.673291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-nova-metadata-tls-certs\") pod \"45cabc27-dfc9-4030-9508-cd366682d788\" (UID: \"45cabc27-dfc9-4030-9508-cd366682d788\") " Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.675033 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/45cabc27-dfc9-4030-9508-cd366682d788-logs" (OuterVolumeSpecName: "logs") pod "45cabc27-dfc9-4030-9508-cd366682d788" (UID: "45cabc27-dfc9-4030-9508-cd366682d788"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.680010 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45cabc27-dfc9-4030-9508-cd366682d788-kube-api-access-hx2t5" (OuterVolumeSpecName: "kube-api-access-hx2t5") pod "45cabc27-dfc9-4030-9508-cd366682d788" (UID: "45cabc27-dfc9-4030-9508-cd366682d788"). InnerVolumeSpecName "kube-api-access-hx2t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.709544 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45cabc27-dfc9-4030-9508-cd366682d788" (UID: "45cabc27-dfc9-4030-9508-cd366682d788"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.719073 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-config-data" (OuterVolumeSpecName: "config-data") pod "45cabc27-dfc9-4030-9508-cd366682d788" (UID: "45cabc27-dfc9-4030-9508-cd366682d788"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.742110 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "45cabc27-dfc9-4030-9508-cd366682d788" (UID: "45cabc27-dfc9-4030-9508-cd366682d788"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.775764 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx2t5\" (UniqueName: \"kubernetes.io/projected/45cabc27-dfc9-4030-9508-cd366682d788-kube-api-access-hx2t5\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.775801 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45cabc27-dfc9-4030-9508-cd366682d788-logs\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.775813 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.775824 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.775834 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cabc27-dfc9-4030-9508-cd366682d788-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.803156 4907 generic.go:334] "Generic (PLEG): container finished" podID="45cabc27-dfc9-4030-9508-cd366682d788" containerID="e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46" exitCode=0 Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.803243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45cabc27-dfc9-4030-9508-cd366682d788","Type":"ContainerDied","Data":"e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46"} 
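The kubelet records above all share the klog header layout: a severity letter (I/W/E/F), the month and day fused into four digits, a microsecond timestamp, the process ID, and the emitting `file:line`, followed by the structured message. A minimal sketch of pulling those fields apart (the regex and function names here are illustrative, not part of any kubelet tooling):

```python
import re

# klog header: <severity><MMDD> <HH:MM:SS.micros> <pid> <file>:<line>]
# e.g. I1009 19:49:02.803156 4907 generic.go:334] "Generic (PLEG): ..."
KLOG_HEADER = re.compile(
    r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) '
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d{6}) '
    r'(?P<pid>\d+) '
    r'(?P<src>[\w.]+:\d+)\]'
)

def parse_klog(line: str):
    """Return (severity, source file:line) for a klog record, or None if the
    line does not carry a klog header (e.g. a plain systemd message)."""
    m = KLOG_HEADER.search(line)
    if m is None:
        return None
    return m.group('sev'), m.group('src')

sample = ('I1009 19:49:02.803156 4907 generic.go:334] '
          '"Generic (PLEG): container finished"')
print(parse_klog(sample))  # ('I', 'generic.go:334')
```

Note that severity `E` here does not always mean a failure: the `cpu_manager.go:410` "RemoveStaleState" records above are logged at error level but describe routine cleanup of state left by deleted pods.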
Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.803312 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"45cabc27-dfc9-4030-9508-cd366682d788","Type":"ContainerDied","Data":"1f0c81a464e1812414290a6378964e92663681bbffaa7ef7db706aa75fe8678c"} Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.803341 4907 scope.go:117] "RemoveContainer" containerID="e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.803270 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.839528 4907 scope.go:117] "RemoveContainer" containerID="2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.849485 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.865263 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.878560 4907 scope.go:117] "RemoveContainer" containerID="e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46" Oct 09 19:49:02 crc kubenswrapper[4907]: E1009 19:49:02.879134 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46\": container with ID starting with e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46 not found: ID does not exist" containerID="e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.879257 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46"} err="failed to get container status \"e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46\": rpc error: code = NotFound desc = could not find container \"e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46\": container with ID starting with e6307a59d6996640bb6f6ef208d3ff83e6870240d21491eb518e742c77789c46 not found: ID does not exist" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.879363 4907 scope.go:117] "RemoveContainer" containerID="2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0" Oct 09 19:49:02 crc kubenswrapper[4907]: E1009 19:49:02.879828 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0\": container with ID starting with 2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0 not found: ID does not exist" containerID="2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.879921 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0"} err="failed to get container status \"2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0\": rpc error: code = NotFound desc = could not find container \"2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0\": container with ID starting with 2a63506acc0e940673860ba42efc5b681f7d13878cfe94a3023106aa4d5ca4f0 not found: ID does not exist" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.889887 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:49:02 crc kubenswrapper[4907]: E1009 19:49:02.891421 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-metadata" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.891445 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-metadata" Oct 09 19:49:02 crc kubenswrapper[4907]: E1009 19:49:02.891500 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-log" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.891510 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-log" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.891736 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-metadata" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.891767 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="45cabc27-dfc9-4030-9508-cd366682d788" containerName="nova-metadata-log" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.893216 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.897293 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.897439 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.902521 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.979499 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33bc0e25-b33d-4af0-b735-cac7deff34eb-logs\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.979850 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.979977 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-config-data\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.980036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:02 crc kubenswrapper[4907]: I1009 19:49:02.980061 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6qv\" (UniqueName: \"kubernetes.io/projected/33bc0e25-b33d-4af0-b735-cac7deff34eb-kube-api-access-lq6qv\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.082147 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-config-data\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.082219 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.082241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6qv\" (UniqueName: \"kubernetes.io/projected/33bc0e25-b33d-4af0-b735-cac7deff34eb-kube-api-access-lq6qv\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.082312 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33bc0e25-b33d-4af0-b735-cac7deff34eb-logs\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.082340 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.082869 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33bc0e25-b33d-4af0-b735-cac7deff34eb-logs\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.086647 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.086803 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.087154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bc0e25-b33d-4af0-b735-cac7deff34eb-config-data\") pod \"nova-metadata-0\" (UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.097938 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6qv\" (UniqueName: \"kubernetes.io/projected/33bc0e25-b33d-4af0-b735-cac7deff34eb-kube-api-access-lq6qv\") pod \"nova-metadata-0\" 
(UID: \"33bc0e25-b33d-4af0-b735-cac7deff34eb\") " pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.165607 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45cabc27-dfc9-4030-9508-cd366682d788" path="/var/lib/kubelet/pods/45cabc27-dfc9-4030-9508-cd366682d788/volumes" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.221649 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.741362 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.817924 4907 generic.go:334] "Generic (PLEG): container finished" podID="c7aa1be1-c204-41b1-8cf2-fc77138a5673" containerID="baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794" exitCode=0 Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.817997 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7aa1be1-c204-41b1-8cf2-fc77138a5673","Type":"ContainerDied","Data":"baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794"} Oct 09 19:49:03 crc kubenswrapper[4907]: I1009 19:49:03.819311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33bc0e25-b33d-4af0-b735-cac7deff34eb","Type":"ContainerStarted","Data":"8be381f9777d0fa75a11359bac4f9ecec58b5a66ea47b5567ef2a8a869c83d2b"} Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.031582 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.111742 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-config-data\") pod \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.113455 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-combined-ca-bundle\") pod \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.113663 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tvpz\" (UniqueName: \"kubernetes.io/projected/c7aa1be1-c204-41b1-8cf2-fc77138a5673-kube-api-access-7tvpz\") pod \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\" (UID: \"c7aa1be1-c204-41b1-8cf2-fc77138a5673\") " Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.119915 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7aa1be1-c204-41b1-8cf2-fc77138a5673-kube-api-access-7tvpz" (OuterVolumeSpecName: "kube-api-access-7tvpz") pod "c7aa1be1-c204-41b1-8cf2-fc77138a5673" (UID: "c7aa1be1-c204-41b1-8cf2-fc77138a5673"). InnerVolumeSpecName "kube-api-access-7tvpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.145976 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7aa1be1-c204-41b1-8cf2-fc77138a5673" (UID: "c7aa1be1-c204-41b1-8cf2-fc77138a5673"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.147565 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-config-data" (OuterVolumeSpecName: "config-data") pod "c7aa1be1-c204-41b1-8cf2-fc77138a5673" (UID: "c7aa1be1-c204-41b1-8cf2-fc77138a5673"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.219875 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.221593 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa1be1-c204-41b1-8cf2-fc77138a5673-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.221694 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tvpz\" (UniqueName: \"kubernetes.io/projected/c7aa1be1-c204-41b1-8cf2-fc77138a5673-kube-api-access-7tvpz\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.834713 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33bc0e25-b33d-4af0-b735-cac7deff34eb","Type":"ContainerStarted","Data":"1d92f31283b0ac6cee1b988c7043ce415af80c77cd2ef01df31a06c889fa91ed"} Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.835090 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33bc0e25-b33d-4af0-b735-cac7deff34eb","Type":"ContainerStarted","Data":"dab622976fca7d63b2c7b9d12277c84179d5930d11bf91c8ea9d58d0df3cb9af"} Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.837114 4907 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.837099 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7aa1be1-c204-41b1-8cf2-fc77138a5673","Type":"ContainerDied","Data":"6a12607c001f875429ebd2518d22a311d600ee2562d3e6e336403c97e7c3bb26"} Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.837273 4907 scope.go:117] "RemoveContainer" containerID="baacca85ab4e811e75f032a0475f9d7ef5b20f0a2d5ecd62f8ea4b01a7a02794" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.872356 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.872336956 podStartE2EDuration="2.872336956s" podCreationTimestamp="2025-10-09 19:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:49:04.860809587 +0000 UTC m=+1230.392777076" watchObservedRunningTime="2025-10-09 19:49:04.872336956 +0000 UTC m=+1230.404304445" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.901312 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.918433 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.935084 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:49:04 crc kubenswrapper[4907]: E1009 19:49:04.935700 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7aa1be1-c204-41b1-8cf2-fc77138a5673" containerName="nova-scheduler-scheduler" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.935732 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7aa1be1-c204-41b1-8cf2-fc77138a5673" containerName="nova-scheduler-scheduler" Oct 09 19:49:04 crc 
kubenswrapper[4907]: I1009 19:49:04.935985 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7aa1be1-c204-41b1-8cf2-fc77138a5673" containerName="nova-scheduler-scheduler" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.936659 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.940903 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 19:49:04 crc kubenswrapper[4907]: I1009 19:49:04.949827 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.037010 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f087483f-1925-46d1-a58d-c7cf2354fbb1-config-data\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.037068 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087483f-1925-46d1-a58d-c7cf2354fbb1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.037139 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kdlk\" (UniqueName: \"kubernetes.io/projected/f087483f-1925-46d1-a58d-c7cf2354fbb1-kube-api-access-8kdlk\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.138935 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f087483f-1925-46d1-a58d-c7cf2354fbb1-config-data\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.139005 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087483f-1925-46d1-a58d-c7cf2354fbb1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.139045 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kdlk\" (UniqueName: \"kubernetes.io/projected/f087483f-1925-46d1-a58d-c7cf2354fbb1-kube-api-access-8kdlk\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.145167 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f087483f-1925-46d1-a58d-c7cf2354fbb1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.150446 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f087483f-1925-46d1-a58d-c7cf2354fbb1-config-data\") pod \"nova-scheduler-0\" (UID: \"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.158217 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kdlk\" (UniqueName: \"kubernetes.io/projected/f087483f-1925-46d1-a58d-c7cf2354fbb1-kube-api-access-8kdlk\") pod \"nova-scheduler-0\" (UID: 
\"f087483f-1925-46d1-a58d-c7cf2354fbb1\") " pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.175044 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7aa1be1-c204-41b1-8cf2-fc77138a5673" path="/var/lib/kubelet/pods/c7aa1be1-c204-41b1-8cf2-fc77138a5673/volumes" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.253892 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.702239 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 19:49:05 crc kubenswrapper[4907]: W1009 19:49:05.710172 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf087483f_1925_46d1_a58d_c7cf2354fbb1.slice/crio-e0632f901ba948a7c7407c43e92c6d9ad1969439f7059734f93d557d2a9ea02c WatchSource:0}: Error finding container e0632f901ba948a7c7407c43e92c6d9ad1969439f7059734f93d557d2a9ea02c: Status 404 returned error can't find the container with id e0632f901ba948a7c7407c43e92c6d9ad1969439f7059734f93d557d2a9ea02c Oct 09 19:49:05 crc kubenswrapper[4907]: I1009 19:49:05.850782 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f087483f-1925-46d1-a58d-c7cf2354fbb1","Type":"ContainerStarted","Data":"e0632f901ba948a7c7407c43e92c6d9ad1969439f7059734f93d557d2a9ea02c"} Oct 09 19:49:06 crc kubenswrapper[4907]: I1009 19:49:06.860790 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f087483f-1925-46d1-a58d-c7cf2354fbb1","Type":"ContainerStarted","Data":"3b31f8b057780c7d1d70d2ebb124773d52ee4de38f8315fa3f099f305dd0f54d"} Oct 09 19:49:06 crc kubenswrapper[4907]: I1009 19:49:06.906156 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.906137578 
podStartE2EDuration="2.906137578s" podCreationTimestamp="2025-10-09 19:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:49:06.885187538 +0000 UTC m=+1232.417155117" watchObservedRunningTime="2025-10-09 19:49:06.906137578 +0000 UTC m=+1232.438105067" Oct 09 19:49:08 crc kubenswrapper[4907]: I1009 19:49:08.222716 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 19:49:08 crc kubenswrapper[4907]: I1009 19:49:08.223047 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 19:49:10 crc kubenswrapper[4907]: I1009 19:49:10.254252 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 19:49:10 crc kubenswrapper[4907]: I1009 19:49:10.519071 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 19:49:10 crc kubenswrapper[4907]: I1009 19:49:10.519157 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 19:49:11 crc kubenswrapper[4907]: I1009 19:49:11.535665 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83277287-28b0-43e3-98e7-e8367e7a87d9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 19:49:11 crc kubenswrapper[4907]: I1009 19:49:11.535679 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83277287-28b0-43e3-98e7-e8367e7a87d9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 19:49:13 crc kubenswrapper[4907]: I1009 19:49:13.222961 4907 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 19:49:13 crc kubenswrapper[4907]: I1009 19:49:13.223388 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 19:49:14 crc kubenswrapper[4907]: I1009 19:49:14.239740 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="33bc0e25-b33d-4af0-b735-cac7deff34eb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 19:49:14 crc kubenswrapper[4907]: I1009 19:49:14.239794 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="33bc0e25-b33d-4af0-b735-cac7deff34eb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 19:49:15 crc kubenswrapper[4907]: I1009 19:49:15.254415 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 19:49:15 crc kubenswrapper[4907]: I1009 19:49:15.285028 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 19:49:16 crc kubenswrapper[4907]: I1009 19:49:16.004755 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 19:49:17 crc kubenswrapper[4907]: I1009 19:49:17.010439 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 19:49:20 crc kubenswrapper[4907]: I1009 19:49:20.527985 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 19:49:20 crc kubenswrapper[4907]: I1009 19:49:20.531100 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Oct 09 19:49:20 crc kubenswrapper[4907]: I1009 19:49:20.532657 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 19:49:20 crc kubenswrapper[4907]: I1009 19:49:20.543395 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 19:49:20 crc kubenswrapper[4907]: I1009 19:49:20.998122 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 19:49:21 crc kubenswrapper[4907]: I1009 19:49:21.003768 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 19:49:23 crc kubenswrapper[4907]: I1009 19:49:23.228011 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 19:49:23 crc kubenswrapper[4907]: I1009 19:49:23.230345 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 19:49:23 crc kubenswrapper[4907]: I1009 19:49:23.236114 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 19:49:24 crc kubenswrapper[4907]: I1009 19:49:24.046251 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 19:49:34 crc kubenswrapper[4907]: I1009 19:49:34.896503 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:49:34 crc kubenswrapper[4907]: I1009 19:49:34.897356 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-central-agent" containerID="cri-o://eb5fcdc04ee77dd56e0078b836494394e233ace9cd3a774d3636a7a77e702697" gracePeriod=30 Oct 09 19:49:34 crc kubenswrapper[4907]: I1009 19:49:34.897439 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="proxy-httpd" containerID="cri-o://51821cad36010203510dc30e22dc2e3f8e888516a73790af43b2d2817ab05de4" gracePeriod=30 Oct 09 19:49:34 crc kubenswrapper[4907]: I1009 19:49:34.897576 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-notification-agent" containerID="cri-o://fabfcc1179c1e16af17e3435771abec67ea9abf35a6740e22c4cd40b847bcd83" gracePeriod=30 Oct 09 19:49:34 crc kubenswrapper[4907]: I1009 19:49:34.897562 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="sg-core" containerID="cri-o://3df386cada414505981df7b8db6dbc6c790994ac71a223083363d31e65859eb4" gracePeriod=30 Oct 09 19:49:35 crc kubenswrapper[4907]: I1009 19:49:35.200630 4907 generic.go:334] "Generic (PLEG): container finished" podID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerID="3df386cada414505981df7b8db6dbc6c790994ac71a223083363d31e65859eb4" exitCode=2 Oct 09 19:49:35 crc kubenswrapper[4907]: I1009 19:49:35.200673 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerDied","Data":"3df386cada414505981df7b8db6dbc6c790994ac71a223083363d31e65859eb4"} Oct 09 19:49:35 crc kubenswrapper[4907]: I1009 19:49:35.535095 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:49:36 crc kubenswrapper[4907]: I1009 19:49:36.212561 4907 generic.go:334] "Generic (PLEG): container finished" podID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerID="51821cad36010203510dc30e22dc2e3f8e888516a73790af43b2d2817ab05de4" exitCode=0 Oct 09 19:49:36 crc kubenswrapper[4907]: I1009 19:49:36.212592 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerID="eb5fcdc04ee77dd56e0078b836494394e233ace9cd3a774d3636a7a77e702697" exitCode=0 Oct 09 19:49:36 crc kubenswrapper[4907]: I1009 19:49:36.212612 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerDied","Data":"51821cad36010203510dc30e22dc2e3f8e888516a73790af43b2d2817ab05de4"} Oct 09 19:49:36 crc kubenswrapper[4907]: I1009 19:49:36.212637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerDied","Data":"eb5fcdc04ee77dd56e0078b836494394e233ace9cd3a774d3636a7a77e702697"} Oct 09 19:49:36 crc kubenswrapper[4907]: I1009 19:49:36.640438 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.234147 4907 generic.go:334] "Generic (PLEG): container finished" podID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerID="fabfcc1179c1e16af17e3435771abec67ea9abf35a6740e22c4cd40b847bcd83" exitCode=0 Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.234433 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerDied","Data":"fabfcc1179c1e16af17e3435771abec67ea9abf35a6740e22c4cd40b847bcd83"} Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.598787 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688292 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-scripts\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688335 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-sg-core-conf-yaml\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688378 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-combined-ca-bundle\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688414 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-config-data\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688499 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-ceilometer-tls-certs\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688549 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-run-httpd\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688588 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-log-httpd\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.688625 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnnvw\" (UniqueName: \"kubernetes.io/projected/7ece5540-61b8-4f64-b55d-d3a93be86382-kube-api-access-fnnvw\") pod \"7ece5540-61b8-4f64-b55d-d3a93be86382\" (UID: \"7ece5540-61b8-4f64-b55d-d3a93be86382\") " Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.690049 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: "7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.694686 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: "7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.694979 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ece5540-61b8-4f64-b55d-d3a93be86382-kube-api-access-fnnvw" (OuterVolumeSpecName: "kube-api-access-fnnvw") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: "7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "kube-api-access-fnnvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.709931 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-scripts" (OuterVolumeSpecName: "scripts") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: "7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.750795 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: "7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.767267 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: "7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.790965 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.791044 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.791065 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.791079 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.791091 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ece5540-61b8-4f64-b55d-d3a93be86382-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.791103 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnnvw\" (UniqueName: \"kubernetes.io/projected/7ece5540-61b8-4f64-b55d-d3a93be86382-kube-api-access-fnnvw\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.837172 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: 
"7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.854459 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-config-data" (OuterVolumeSpecName: "config-data") pod "7ece5540-61b8-4f64-b55d-d3a93be86382" (UID: "7ece5540-61b8-4f64-b55d-d3a93be86382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.892701 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:37 crc kubenswrapper[4907]: I1009 19:49:37.893068 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ece5540-61b8-4f64-b55d-d3a93be86382-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.246334 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ece5540-61b8-4f64-b55d-d3a93be86382","Type":"ContainerDied","Data":"abf6d1fd46ddc94455e30420751c8b48c44f61bb7f587ec5721e15f2416ced7f"} Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.246388 4907 scope.go:117] "RemoveContainer" containerID="51821cad36010203510dc30e22dc2e3f8e888516a73790af43b2d2817ab05de4" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.246547 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.272285 4907 scope.go:117] "RemoveContainer" containerID="3df386cada414505981df7b8db6dbc6c790994ac71a223083363d31e65859eb4" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.293815 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.301350 4907 scope.go:117] "RemoveContainer" containerID="fabfcc1179c1e16af17e3435771abec67ea9abf35a6740e22c4cd40b847bcd83" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.317513 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.323943 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:49:38 crc kubenswrapper[4907]: E1009 19:49:38.324946 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-notification-agent" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.324963 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-notification-agent" Oct 09 19:49:38 crc kubenswrapper[4907]: E1009 19:49:38.324976 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-central-agent" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.324982 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-central-agent" Oct 09 19:49:38 crc kubenswrapper[4907]: E1009 19:49:38.324990 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="sg-core" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.324997 4907 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="sg-core" Oct 09 19:49:38 crc kubenswrapper[4907]: E1009 19:49:38.325031 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="proxy-httpd" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.325037 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="proxy-httpd" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.325301 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="proxy-httpd" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.325319 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="sg-core" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.325331 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-notification-agent" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.325349 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" containerName="ceilometer-central-agent" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.328739 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.328974 4907 scope.go:117] "RemoveContainer" containerID="eb5fcdc04ee77dd56e0078b836494394e233ace9cd3a774d3636a7a77e702697" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.332427 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.332678 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.333012 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.343592 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.407719 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bzjf\" (UniqueName: \"kubernetes.io/projected/93a1e245-baac-44c9-ba36-46e2af13f3ea-kube-api-access-9bzjf\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.407783 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.407821 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-scripts\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " 
pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.407884 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.407905 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-config-data\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.407943 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.407979 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-run-httpd\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.408047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-log-httpd\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.509638 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.510419 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-run-httpd\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.510831 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-run-httpd\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.511626 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-log-httpd\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.511862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bzjf\" (UniqueName: \"kubernetes.io/projected/93a1e245-baac-44c9-ba36-46e2af13f3ea-kube-api-access-9bzjf\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.511924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.511980 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-scripts\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.512052 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-config-data\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.512075 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.514216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.515583 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.516024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.516668 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-log-httpd\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.518352 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-config-data\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.521241 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-scripts\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.539152 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bzjf\" (UniqueName: \"kubernetes.io/projected/93a1e245-baac-44c9-ba36-46e2af13f3ea-kube-api-access-9bzjf\") pod \"ceilometer-0\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " pod="openstack/ceilometer-0" Oct 09 19:49:38 crc kubenswrapper[4907]: I1009 19:49:38.691250 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 19:49:39 crc kubenswrapper[4907]: I1009 19:49:39.172758 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ece5540-61b8-4f64-b55d-d3a93be86382" path="/var/lib/kubelet/pods/7ece5540-61b8-4f64-b55d-d3a93be86382/volumes" Oct 09 19:49:39 crc kubenswrapper[4907]: I1009 19:49:39.274655 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 19:49:40 crc kubenswrapper[4907]: I1009 19:49:40.250608 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="rabbitmq" containerID="cri-o://1e10ff7f190df66a67d83fbbedaba5e91ed8f3fe46703f27d608f919ee5d8219" gracePeriod=604796 Oct 09 19:49:40 crc kubenswrapper[4907]: I1009 19:49:40.303753 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerStarted","Data":"1761bd048d74818883c24f60c9af82d4dd0098da9662192d637c12459191a82a"} Oct 09 19:49:40 crc kubenswrapper[4907]: I1009 19:49:40.886021 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Oct 09 19:49:41 crc kubenswrapper[4907]: I1009 19:49:41.155800 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="rabbitmq" containerID="cri-o://60569705daf757c38afdbbef43fbd00d1607a2a4646f9e9d0d488856f660de4b" gracePeriod=604796 Oct 09 19:49:41 crc kubenswrapper[4907]: I1009 19:49:41.174450 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Oct 09 19:49:47 crc kubenswrapper[4907]: I1009 19:49:47.379495 4907 generic.go:334] "Generic (PLEG): container finished" podID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerID="1e10ff7f190df66a67d83fbbedaba5e91ed8f3fe46703f27d608f919ee5d8219" exitCode=0 Oct 09 19:49:47 crc kubenswrapper[4907]: I1009 19:49:47.379590 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c","Type":"ContainerDied","Data":"1e10ff7f190df66a67d83fbbedaba5e91ed8f3fe46703f27d608f919ee5d8219"} Oct 09 19:49:48 crc kubenswrapper[4907]: I1009 19:49:48.393728 4907 generic.go:334] "Generic (PLEG): container finished" podID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerID="60569705daf757c38afdbbef43fbd00d1607a2a4646f9e9d0d488856f660de4b" exitCode=0 Oct 09 19:49:48 crc kubenswrapper[4907]: I1009 19:49:48.393772 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05cb258e-fa1a-4978-b143-d6c817ec0f96","Type":"ContainerDied","Data":"60569705daf757c38afdbbef43fbd00d1607a2a4646f9e9d0d488856f660de4b"} Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.333086 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jbp5l"] Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.334921 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.337030 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.345178 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jbp5l"] Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.432520 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.432578 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsxfz\" (UniqueName: \"kubernetes.io/projected/d724a204-d5a3-4294-b5b0-2e8640e4cce0-kube-api-access-bsxfz\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.432598 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.432793 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: 
\"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.432932 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.432967 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.433052 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-config\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.534576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.534657 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsxfz\" (UniqueName: \"kubernetes.io/projected/d724a204-d5a3-4294-b5b0-2e8640e4cce0-kube-api-access-bsxfz\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: 
\"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.534686 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.534741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.534796 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.534828 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.534868 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-config\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " 
pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.535712 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.535752 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.536256 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.537119 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-config\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.537176 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.537180 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.556254 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsxfz\" (UniqueName: \"kubernetes.io/projected/d724a204-d5a3-4294-b5b0-2e8640e4cce0-kube-api-access-bsxfz\") pod \"dnsmasq-dns-67b789f86c-jbp5l\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:49 crc kubenswrapper[4907]: I1009 19:49:49.711297 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:50 crc kubenswrapper[4907]: I1009 19:49:50.885535 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Oct 09 19:49:51 crc kubenswrapper[4907]: E1009 19:49:51.551097 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.68:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest" Oct 09 19:49:51 crc kubenswrapper[4907]: E1009 19:49:51.551510 4907 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.68:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest" Oct 09 19:49:51 crc kubenswrapper[4907]: E1009 19:49:51.551716 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:38.102.83.68:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n575h696h648hfdh568h54h65dh8h59dh9bh549h656hbch688hddh567h588h57fh688hd7h86h5f6h557h5ffhc8h7fh6dh565h656h67h576h67fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9bzjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(93a1e245-baac-44c9-ba36-46e2af13f3ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.613668 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.625525 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673453 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-server-conf\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673517 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673562 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-config-data\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673589 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-tls\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673646 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-plugins-conf\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673686 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-erlang-cookie-secret\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673741 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-plugins\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673794 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-config-data\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673842 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-tls\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673873 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-confd\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673923 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-erlang-cookie\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: 
I1009 19:49:51.673970 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05cb258e-fa1a-4978-b143-d6c817ec0f96-erlang-cookie-secret\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.673999 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-plugins\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674020 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsr5n\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-kube-api-access-jsr5n\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674045 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dn4c\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-kube-api-access-8dn4c\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674094 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674121 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-erlang-cookie\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674148 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05cb258e-fa1a-4978-b143-d6c817ec0f96-pod-info\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674180 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-confd\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674217 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-plugins-conf\") pod \"05cb258e-fa1a-4978-b143-d6c817ec0f96\" (UID: \"05cb258e-fa1a-4978-b143-d6c817ec0f96\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674244 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-server-conf\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.674292 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-pod-info\") pod \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\" (UID: \"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c\") " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.680685 4907 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.683013 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.689838 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.690033 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-pod-info" (OuterVolumeSpecName: "pod-info") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.692423 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.692671 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-kube-api-access-jsr5n" (OuterVolumeSpecName: "kube-api-access-jsr5n") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "kube-api-access-jsr5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.694339 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.695595 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.709697 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.719812 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cb258e-fa1a-4978-b143-d6c817ec0f96-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.723720 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-kube-api-access-8dn4c" (OuterVolumeSpecName: "kube-api-access-8dn4c") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "kube-api-access-8dn4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.726704 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.726961 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.727044 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.735912 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.737537 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/05cb258e-fa1a-4978-b143-d6c817ec0f96-pod-info" (OuterVolumeSpecName: "pod-info") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.777589 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-config-data" (OuterVolumeSpecName: "config-data") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778800 4907 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778829 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778838 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778847 4907 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778855 4907 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778865 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778874 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778881 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778891 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778901 4907 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/05cb258e-fa1a-4978-b143-d6c817ec0f96-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778909 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778917 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsr5n\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-kube-api-access-jsr5n\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778926 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dn4c\" (UniqueName: 
\"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-kube-api-access-8dn4c\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778940 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778949 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778957 4907 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/05cb258e-fa1a-4978-b143-d6c817ec0f96-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.778966 4907 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.781119 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-server-conf" (OuterVolumeSpecName: "server-conf") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.818446 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.837361 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-server-conf" (OuterVolumeSpecName: "server-conf") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.841884 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.845043 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-config-data" (OuterVolumeSpecName: "config-data") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.884532 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.884604 4907 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.884615 4907 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/05cb258e-fa1a-4978-b143-d6c817ec0f96-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.884624 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.884633 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.886849 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" (UID: "45406d22-0dd2-4c14-b5f0-6f3c226c0f5c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.944872 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "05cb258e-fa1a-4978-b143-d6c817ec0f96" (UID: "05cb258e-fa1a-4978-b143-d6c817ec0f96"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.985846 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/05cb258e-fa1a-4978-b143-d6c817ec0f96-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:51 crc kubenswrapper[4907]: I1009 19:49:51.985876 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.049072 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jbp5l"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.463055 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45406d22-0dd2-4c14-b5f0-6f3c226c0f5c","Type":"ContainerDied","Data":"0f0d1b9d3a7ac5fff3ca4abbd0efc31641719d943ff2330c2266ca66855ae989"} Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.463489 4907 scope.go:117] "RemoveContainer" containerID="1e10ff7f190df66a67d83fbbedaba5e91ed8f3fe46703f27d608f919ee5d8219" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.463870 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.470725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" event={"ID":"d724a204-d5a3-4294-b5b0-2e8640e4cce0","Type":"ContainerStarted","Data":"385d49832b2de3809c8573ceb8b62651ae69fe69d16e514546703c55060ae0bc"} Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.470770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" event={"ID":"d724a204-d5a3-4294-b5b0-2e8640e4cce0","Type":"ContainerStarted","Data":"43b687cb274892c744dcdc0f7643e713ea47d86b1935a1d61779be1859bcbadb"} Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.473992 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"05cb258e-fa1a-4978-b143-d6c817ec0f96","Type":"ContainerDied","Data":"8db72d064294617ec75103545fc4ec9b3280e2046c3c55dd8578944fd51d188b"} Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.474077 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.500626 4907 scope.go:117] "RemoveContainer" containerID="6e458b11269c1cb13b676c476b97734eb508c7576d9ba6e4b28cb7cd48f34f70" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.542058 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.557109 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.590138 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.599800 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.607182 4907 scope.go:117] "RemoveContainer" containerID="60569705daf757c38afdbbef43fbd00d1607a2a4646f9e9d0d488856f660de4b" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.618264 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: E1009 19:49:52.620524 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="rabbitmq" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.620549 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="rabbitmq" Oct 09 19:49:52 crc kubenswrapper[4907]: E1009 19:49:52.620589 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="setup-container" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.620598 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="setup-container" Oct 09 19:49:52 crc 
kubenswrapper[4907]: E1009 19:49:52.620625 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="setup-container" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.620633 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="setup-container" Oct 09 19:49:52 crc kubenswrapper[4907]: E1009 19:49:52.620647 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="rabbitmq" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.620653 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="rabbitmq" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.620851 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="rabbitmq" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.620876 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" containerName="rabbitmq" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.623153 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.629428 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.629441 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.629614 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.629709 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5v9sj" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.629765 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.629793 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.629888 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.630112 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.640512 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.642354 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.645202 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.645603 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.645796 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.645929 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.646146 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.646220 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hprcd" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.645845 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.647339 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.654667 4907 scope.go:117] "RemoveContainer" containerID="90127cac8301cedbb378bf90ca8583c35071b8c22a6f7cf9a0d8fbe7ef6cfe53" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.710230 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 
crc kubenswrapper[4907]: I1009 19:49:52.710287 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e022a77-723b-47bc-98c5-ad7c72aab0c3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.710329 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.710358 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.710386 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.710406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc 
kubenswrapper[4907]: I1009 19:49:52.710431 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5tgj\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-kube-api-access-x5tgj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.710501 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-config-data\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711355 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711396 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711427 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc 
kubenswrapper[4907]: I1009 19:49:52.711457 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711514 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/274be987-64b1-4406-9f04-c81fe651d851-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711552 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711589 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711622 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/274be987-64b1-4406-9f04-c81fe651d851-pod-info\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.711658 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.713317 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e022a77-723b-47bc-98c5-ad7c72aab0c3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.713376 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.713402 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2l5w\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-kube-api-access-n2l5w\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.713426 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.713447 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-server-conf\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817080 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817118 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817156 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817178 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/274be987-64b1-4406-9f04-c81fe651d851-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817252 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/274be987-64b1-4406-9f04-c81fe651d851-pod-info\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817295 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817313 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e022a77-723b-47bc-98c5-ad7c72aab0c3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817331 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2l5w\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-kube-api-access-n2l5w\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817356 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817386 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-server-conf\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817421 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817439 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e022a77-723b-47bc-98c5-ad7c72aab0c3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817461 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817521 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817543 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817559 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5tgj\" (UniqueName: 
\"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-kube-api-access-x5tgj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.817608 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-config-data\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.818449 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-config-data\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.818698 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.819533 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.819662 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"274be987-64b1-4406-9f04-c81fe651d851\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.820231 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.822098 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.822848 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.823290 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.824012 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc 
kubenswrapper[4907]: I1009 19:49:52.824781 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.825790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e022a77-723b-47bc-98c5-ad7c72aab0c3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.826840 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.827652 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/274be987-64b1-4406-9f04-c81fe651d851-server-conf\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.828325 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e022a77-723b-47bc-98c5-ad7c72aab0c3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.829249 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/274be987-64b1-4406-9f04-c81fe651d851-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.836310 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e022a77-723b-47bc-98c5-ad7c72aab0c3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.837266 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/274be987-64b1-4406-9f04-c81fe651d851-pod-info\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.837350 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.840231 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.841273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2l5w\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-kube-api-access-n2l5w\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " 
pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.846046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/274be987-64b1-4406-9f04-c81fe651d851-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.850314 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5tgj\" (UniqueName: \"kubernetes.io/projected/4e022a77-723b-47bc-98c5-ad7c72aab0c3-kube-api-access-x5tgj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.874381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4e022a77-723b-47bc-98c5-ad7c72aab0c3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.907558 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"274be987-64b1-4406-9f04-c81fe651d851\") " pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.968018 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 19:49:52 crc kubenswrapper[4907]: I1009 19:49:52.988079 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:49:53 crc kubenswrapper[4907]: I1009 19:49:53.165633 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" path="/var/lib/kubelet/pods/05cb258e-fa1a-4978-b143-d6c817ec0f96/volumes" Oct 09 19:49:53 crc kubenswrapper[4907]: I1009 19:49:53.177026 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45406d22-0dd2-4c14-b5f0-6f3c226c0f5c" path="/var/lib/kubelet/pods/45406d22-0dd2-4c14-b5f0-6f3c226c0f5c/volumes" Oct 09 19:49:53 crc kubenswrapper[4907]: I1009 19:49:53.477175 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 19:49:53 crc kubenswrapper[4907]: W1009 19:49:53.479765 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274be987_64b1_4406_9f04_c81fe651d851.slice/crio-e3f10aba4eebdbf3768307f32741adf74d04a39658be62bd4d0b05bfa79886c5 WatchSource:0}: Error finding container e3f10aba4eebdbf3768307f32741adf74d04a39658be62bd4d0b05bfa79886c5: Status 404 returned error can't find the container with id e3f10aba4eebdbf3768307f32741adf74d04a39658be62bd4d0b05bfa79886c5 Oct 09 19:49:53 crc kubenswrapper[4907]: I1009 19:49:53.491715 4907 generic.go:334] "Generic (PLEG): container finished" podID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerID="385d49832b2de3809c8573ceb8b62651ae69fe69d16e514546703c55060ae0bc" exitCode=0 Oct 09 19:49:53 crc kubenswrapper[4907]: I1009 19:49:53.491787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" event={"ID":"d724a204-d5a3-4294-b5b0-2e8640e4cce0","Type":"ContainerDied","Data":"385d49832b2de3809c8573ceb8b62651ae69fe69d16e514546703c55060ae0bc"} Oct 09 19:49:53 crc kubenswrapper[4907]: I1009 19:49:53.500379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerStarted","Data":"79761f4f7288208cdb015650c604aff38ff244c4833d0aa6aaef8971055ac638"} Oct 09 19:49:53 crc kubenswrapper[4907]: I1009 19:49:53.563637 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 19:49:53 crc kubenswrapper[4907]: W1009 19:49:53.566550 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e022a77_723b_47bc_98c5_ad7c72aab0c3.slice/crio-acebb734df0542957ff9bc94ae09d7353219ca2ad44ea285ca97d314d4e311c9 WatchSource:0}: Error finding container acebb734df0542957ff9bc94ae09d7353219ca2ad44ea285ca97d314d4e311c9: Status 404 returned error can't find the container with id acebb734df0542957ff9bc94ae09d7353219ca2ad44ea285ca97d314d4e311c9 Oct 09 19:49:54 crc kubenswrapper[4907]: I1009 19:49:54.511273 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" event={"ID":"d724a204-d5a3-4294-b5b0-2e8640e4cce0","Type":"ContainerStarted","Data":"1b4fbb182decba1dbe038bfe0ae2a01fcdcb2deda3639a2dc87cba9be29ed437"} Oct 09 19:49:54 crc kubenswrapper[4907]: I1009 19:49:54.511817 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:54 crc kubenswrapper[4907]: I1009 19:49:54.512813 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"274be987-64b1-4406-9f04-c81fe651d851","Type":"ContainerStarted","Data":"e3f10aba4eebdbf3768307f32741adf74d04a39658be62bd4d0b05bfa79886c5"} Oct 09 19:49:54 crc kubenswrapper[4907]: I1009 19:49:54.518784 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerStarted","Data":"b59fef055a607abe26d6dd7b4e6c0f42fb27cc1dba53968467e95667208afcbb"} Oct 09 19:49:54 crc kubenswrapper[4907]: I1009 
19:49:54.523201 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4e022a77-723b-47bc-98c5-ad7c72aab0c3","Type":"ContainerStarted","Data":"acebb734df0542957ff9bc94ae09d7353219ca2ad44ea285ca97d314d4e311c9"} Oct 09 19:49:54 crc kubenswrapper[4907]: I1009 19:49:54.551852 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" podStartSLOduration=5.551814531 podStartE2EDuration="5.551814531s" podCreationTimestamp="2025-10-09 19:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:49:54.53078558 +0000 UTC m=+1280.062753089" watchObservedRunningTime="2025-10-09 19:49:54.551814531 +0000 UTC m=+1280.083782020" Oct 09 19:49:55 crc kubenswrapper[4907]: I1009 19:49:55.542032 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4e022a77-723b-47bc-98c5-ad7c72aab0c3","Type":"ContainerStarted","Data":"177ca5bc3bd314ddfd08c8340e265cce9111d6df6177f5ed70cb97b5a69c595f"} Oct 09 19:49:55 crc kubenswrapper[4907]: I1009 19:49:55.545539 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"274be987-64b1-4406-9f04-c81fe651d851","Type":"ContainerStarted","Data":"6cd62c5eb271e58d184e57f85975434fb7e4c80b986c994c58fd0a98035c6b3e"} Oct 09 19:49:55 crc kubenswrapper[4907]: E1009 19:49:55.921443 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" Oct 09 19:49:56 crc kubenswrapper[4907]: I1009 19:49:56.175445 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" 
podUID="05cb258e-fa1a-4978-b143-d6c817ec0f96" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: i/o timeout" Oct 09 19:49:56 crc kubenswrapper[4907]: I1009 19:49:56.558234 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerStarted","Data":"b52a33ba048abbb2ee4170dc75c120902fca79f2296ad600b7dbc6b25f6bf192"} Oct 09 19:49:56 crc kubenswrapper[4907]: I1009 19:49:56.558304 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 19:49:56 crc kubenswrapper[4907]: E1009 19:49:56.558814 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.68:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest\\\"\"" pod="openstack/ceilometer-0" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" Oct 09 19:49:57 crc kubenswrapper[4907]: E1009 19:49:57.568894 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.68:5001/podified-master-centos10/openstack-ceilometer-central:telemetry_latest\\\"\"" pod="openstack/ceilometer-0" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" Oct 09 19:49:59 crc kubenswrapper[4907]: I1009 19:49:59.712666 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:49:59 crc kubenswrapper[4907]: I1009 19:49:59.803607 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-br6lt"] Oct 09 19:49:59 crc kubenswrapper[4907]: I1009 19:49:59.803869 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" podUID="c47a2842-7ddb-4676-9d20-f507044a2b76" 
containerName="dnsmasq-dns" containerID="cri-o://f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773" gracePeriod=10 Oct 09 19:49:59 crc kubenswrapper[4907]: I1009 19:49:59.987624 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-gbnql"] Oct 09 19:49:59 crc kubenswrapper[4907]: I1009 19:49:59.991375 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.060640 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-gbnql"] Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.077665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.077754 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.077782 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-config\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.077804 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.077825 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.077958 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tg4\" (UniqueName: \"kubernetes.io/projected/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-kube-api-access-k5tg4\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.077990 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.180926 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.180992 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-config\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.181019 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.181039 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.181152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tg4\" (UniqueName: \"kubernetes.io/projected/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-kube-api-access-k5tg4\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.181185 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.181258 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.182331 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.182616 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.182992 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.183384 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.184048 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.184153 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-config\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.208332 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5tg4\" (UniqueName: \"kubernetes.io/projected/1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e-kube-api-access-k5tg4\") pod \"dnsmasq-dns-cb6ffcf87-gbnql\" (UID: \"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.365768 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.371128 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.492939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-config\") pod \"c47a2842-7ddb-4676-9d20-f507044a2b76\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.493058 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg29m\" (UniqueName: \"kubernetes.io/projected/c47a2842-7ddb-4676-9d20-f507044a2b76-kube-api-access-sg29m\") pod \"c47a2842-7ddb-4676-9d20-f507044a2b76\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.493142 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-swift-storage-0\") pod \"c47a2842-7ddb-4676-9d20-f507044a2b76\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.493216 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-sb\") pod \"c47a2842-7ddb-4676-9d20-f507044a2b76\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.493400 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-svc\") pod \"c47a2842-7ddb-4676-9d20-f507044a2b76\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.493505 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-nb\") pod \"c47a2842-7ddb-4676-9d20-f507044a2b76\" (UID: \"c47a2842-7ddb-4676-9d20-f507044a2b76\") " Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.505271 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47a2842-7ddb-4676-9d20-f507044a2b76-kube-api-access-sg29m" (OuterVolumeSpecName: "kube-api-access-sg29m") pod "c47a2842-7ddb-4676-9d20-f507044a2b76" (UID: "c47a2842-7ddb-4676-9d20-f507044a2b76"). InnerVolumeSpecName "kube-api-access-sg29m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.554307 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c47a2842-7ddb-4676-9d20-f507044a2b76" (UID: "c47a2842-7ddb-4676-9d20-f507044a2b76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.565456 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c47a2842-7ddb-4676-9d20-f507044a2b76" (UID: "c47a2842-7ddb-4676-9d20-f507044a2b76"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.569129 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c47a2842-7ddb-4676-9d20-f507044a2b76" (UID: "c47a2842-7ddb-4676-9d20-f507044a2b76"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.576109 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-config" (OuterVolumeSpecName: "config") pod "c47a2842-7ddb-4676-9d20-f507044a2b76" (UID: "c47a2842-7ddb-4676-9d20-f507044a2b76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.606026 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c47a2842-7ddb-4676-9d20-f507044a2b76" (UID: "c47a2842-7ddb-4676-9d20-f507044a2b76"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.607211 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.607232 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.607241 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.607251 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:00 
crc kubenswrapper[4907]: I1009 19:50:00.607259 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47a2842-7ddb-4676-9d20-f507044a2b76-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.607267 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg29m\" (UniqueName: \"kubernetes.io/projected/c47a2842-7ddb-4676-9d20-f507044a2b76-kube-api-access-sg29m\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.658418 4907 generic.go:334] "Generic (PLEG): container finished" podID="c47a2842-7ddb-4676-9d20-f507044a2b76" containerID="f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773" exitCode=0 Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.658461 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" event={"ID":"c47a2842-7ddb-4676-9d20-f507044a2b76","Type":"ContainerDied","Data":"f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773"} Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.658508 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" event={"ID":"c47a2842-7ddb-4676-9d20-f507044a2b76","Type":"ContainerDied","Data":"1a71ecf73900304c3f3e68bfbc6138f241f689ee1902990e6200017ab3aefed1"} Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.658526 4907 scope.go:117] "RemoveContainer" containerID="f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.658627 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-br6lt" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.694689 4907 scope.go:117] "RemoveContainer" containerID="33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.734799 4907 scope.go:117] "RemoveContainer" containerID="f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773" Oct 09 19:50:00 crc kubenswrapper[4907]: E1009 19:50:00.735085 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773\": container with ID starting with f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773 not found: ID does not exist" containerID="f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.735109 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773"} err="failed to get container status \"f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773\": rpc error: code = NotFound desc = could not find container \"f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773\": container with ID starting with f8cdb1f34dc91bd1f149f6ccbb7f680a12842ce8a8713ef03ae6481e2ec74773 not found: ID does not exist" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.735126 4907 scope.go:117] "RemoveContainer" containerID="33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533" Oct 09 19:50:00 crc kubenswrapper[4907]: E1009 19:50:00.735280 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533\": container with ID starting with 
33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533 not found: ID does not exist" containerID="33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.735294 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533"} err="failed to get container status \"33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533\": rpc error: code = NotFound desc = could not find container \"33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533\": container with ID starting with 33f27247ea24779a593fd844de9ee41b3b0e1740e379597352f366e9f65e7533 not found: ID does not exist" Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.752514 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-br6lt"] Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.770171 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-br6lt"] Oct 09 19:50:00 crc kubenswrapper[4907]: I1009 19:50:00.839903 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-gbnql"] Oct 09 19:50:00 crc kubenswrapper[4907]: W1009 19:50:00.840419 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ff92e37_5fad_4bc3_954f_4cf7cc3f6b9e.slice/crio-9ac103408005897f083952784c0cf4eb5868cbae5bf64e25ee6bf30d698e59d8 WatchSource:0}: Error finding container 9ac103408005897f083952784c0cf4eb5868cbae5bf64e25ee6bf30d698e59d8: Status 404 returned error can't find the container with id 9ac103408005897f083952784c0cf4eb5868cbae5bf64e25ee6bf30d698e59d8 Oct 09 19:50:01 crc kubenswrapper[4907]: I1009 19:50:01.175836 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47a2842-7ddb-4676-9d20-f507044a2b76" 
path="/var/lib/kubelet/pods/c47a2842-7ddb-4676-9d20-f507044a2b76/volumes" Oct 09 19:50:01 crc kubenswrapper[4907]: I1009 19:50:01.670287 4907 generic.go:334] "Generic (PLEG): container finished" podID="1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e" containerID="55804352d13157e97d6d9ab74eccc73e4e1935e7bdcfd9fe7ad7c474419b5e66" exitCode=0 Oct 09 19:50:01 crc kubenswrapper[4907]: I1009 19:50:01.670396 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" event={"ID":"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e","Type":"ContainerDied","Data":"55804352d13157e97d6d9ab74eccc73e4e1935e7bdcfd9fe7ad7c474419b5e66"} Oct 09 19:50:01 crc kubenswrapper[4907]: I1009 19:50:01.670449 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" event={"ID":"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e","Type":"ContainerStarted","Data":"9ac103408005897f083952784c0cf4eb5868cbae5bf64e25ee6bf30d698e59d8"} Oct 09 19:50:02 crc kubenswrapper[4907]: I1009 19:50:02.684584 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" event={"ID":"1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e","Type":"ContainerStarted","Data":"04c5144ac57f5c518feceb449a6c1e56a83e2fcc2bd36714d3b27c27c42e654b"} Oct 09 19:50:02 crc kubenswrapper[4907]: I1009 19:50:02.685289 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:02 crc kubenswrapper[4907]: I1009 19:50:02.715607 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" podStartSLOduration=3.715590449 podStartE2EDuration="3.715590449s" podCreationTimestamp="2025-10-09 19:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:50:02.714497882 +0000 UTC m=+1288.246465381" watchObservedRunningTime="2025-10-09 19:50:02.715590449 +0000 UTC 
m=+1288.247557938" Oct 09 19:50:06 crc kubenswrapper[4907]: I1009 19:50:06.299531 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:50:06 crc kubenswrapper[4907]: I1009 19:50:06.299942 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:50:08 crc kubenswrapper[4907]: I1009 19:50:08.701754 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 19:50:09 crc kubenswrapper[4907]: I1009 19:50:09.747861 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerStarted","Data":"2b92bb2c6b19233b35f0cd7a26509483b6c9f9115bb5d2c4b567a12151e04c1e"} Oct 09 19:50:09 crc kubenswrapper[4907]: I1009 19:50:09.771838 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.293179728 podStartE2EDuration="31.771817298s" podCreationTimestamp="2025-10-09 19:49:38 +0000 UTC" firstStartedPulling="2025-10-09 19:49:39.303140658 +0000 UTC m=+1264.835108147" lastFinishedPulling="2025-10-09 19:50:08.781778228 +0000 UTC m=+1294.313745717" observedRunningTime="2025-10-09 19:50:09.767215176 +0000 UTC m=+1295.299182675" watchObservedRunningTime="2025-10-09 19:50:09.771817298 +0000 UTC m=+1295.303784787" Oct 09 19:50:10 crc kubenswrapper[4907]: I1009 19:50:10.367619 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-cb6ffcf87-gbnql" Oct 09 19:50:10 crc kubenswrapper[4907]: I1009 19:50:10.432766 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jbp5l"] Oct 09 19:50:10 crc kubenswrapper[4907]: I1009 19:50:10.432984 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" podUID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerName="dnsmasq-dns" containerID="cri-o://1b4fbb182decba1dbe038bfe0ae2a01fcdcb2deda3639a2dc87cba9be29ed437" gracePeriod=10 Oct 09 19:50:10 crc kubenswrapper[4907]: I1009 19:50:10.760679 4907 generic.go:334] "Generic (PLEG): container finished" podID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerID="1b4fbb182decba1dbe038bfe0ae2a01fcdcb2deda3639a2dc87cba9be29ed437" exitCode=0 Oct 09 19:50:10 crc kubenswrapper[4907]: I1009 19:50:10.760725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" event={"ID":"d724a204-d5a3-4294-b5b0-2e8640e4cce0","Type":"ContainerDied","Data":"1b4fbb182decba1dbe038bfe0ae2a01fcdcb2deda3639a2dc87cba9be29ed437"} Oct 09 19:50:10 crc kubenswrapper[4907]: I1009 19:50:10.963559 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.119578 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-sb\") pod \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.119701 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-swift-storage-0\") pod \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.120564 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-nb\") pod \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.121499 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-openstack-edpm-ipam\") pod \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.121551 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-svc\") pod \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.121607 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-config\") pod \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.121632 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsxfz\" (UniqueName: \"kubernetes.io/projected/d724a204-d5a3-4294-b5b0-2e8640e4cce0-kube-api-access-bsxfz\") pod \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\" (UID: \"d724a204-d5a3-4294-b5b0-2e8640e4cce0\") " Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.135730 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d724a204-d5a3-4294-b5b0-2e8640e4cce0-kube-api-access-bsxfz" (OuterVolumeSpecName: "kube-api-access-bsxfz") pod "d724a204-d5a3-4294-b5b0-2e8640e4cce0" (UID: "d724a204-d5a3-4294-b5b0-2e8640e4cce0"). InnerVolumeSpecName "kube-api-access-bsxfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.203100 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d724a204-d5a3-4294-b5b0-2e8640e4cce0" (UID: "d724a204-d5a3-4294-b5b0-2e8640e4cce0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.215819 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d724a204-d5a3-4294-b5b0-2e8640e4cce0" (UID: "d724a204-d5a3-4294-b5b0-2e8640e4cce0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.221196 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-config" (OuterVolumeSpecName: "config") pod "d724a204-d5a3-4294-b5b0-2e8640e4cce0" (UID: "d724a204-d5a3-4294-b5b0-2e8640e4cce0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.222184 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d724a204-d5a3-4294-b5b0-2e8640e4cce0" (UID: "d724a204-d5a3-4294-b5b0-2e8640e4cce0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.224805 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d724a204-d5a3-4294-b5b0-2e8640e4cce0" (UID: "d724a204-d5a3-4294-b5b0-2e8640e4cce0"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.226156 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-config\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.226186 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsxfz\" (UniqueName: \"kubernetes.io/projected/d724a204-d5a3-4294-b5b0-2e8640e4cce0-kube-api-access-bsxfz\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.226203 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.226215 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.226227 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.226238 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.227541 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d724a204-d5a3-4294-b5b0-2e8640e4cce0" (UID: 
"d724a204-d5a3-4294-b5b0-2e8640e4cce0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.327931 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d724a204-d5a3-4294-b5b0-2e8640e4cce0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.772319 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" event={"ID":"d724a204-d5a3-4294-b5b0-2e8640e4cce0","Type":"ContainerDied","Data":"43b687cb274892c744dcdc0f7643e713ea47d86b1935a1d61779be1859bcbadb"} Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.772376 4907 scope.go:117] "RemoveContainer" containerID="1b4fbb182decba1dbe038bfe0ae2a01fcdcb2deda3639a2dc87cba9be29ed437" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.772399 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jbp5l" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.808458 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jbp5l"] Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.809255 4907 scope.go:117] "RemoveContainer" containerID="385d49832b2de3809c8573ceb8b62651ae69fe69d16e514546703c55060ae0bc" Oct 09 19:50:11 crc kubenswrapper[4907]: I1009 19:50:11.820922 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jbp5l"] Oct 09 19:50:13 crc kubenswrapper[4907]: I1009 19:50:13.186115 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" path="/var/lib/kubelet/pods/d724a204-d5a3-4294-b5b0-2e8640e4cce0/volumes" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.592941 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf"] Oct 09 19:50:23 crc kubenswrapper[4907]: E1009 19:50:23.593961 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerName="dnsmasq-dns" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.593978 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerName="dnsmasq-dns" Oct 09 19:50:23 crc kubenswrapper[4907]: E1009 19:50:23.593991 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerName="init" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.593998 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerName="init" Oct 09 19:50:23 crc kubenswrapper[4907]: E1009 19:50:23.594027 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47a2842-7ddb-4676-9d20-f507044a2b76" containerName="dnsmasq-dns" Oct 09 19:50:23 crc 
kubenswrapper[4907]: I1009 19:50:23.594034 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47a2842-7ddb-4676-9d20-f507044a2b76" containerName="dnsmasq-dns" Oct 09 19:50:23 crc kubenswrapper[4907]: E1009 19:50:23.594046 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47a2842-7ddb-4676-9d20-f507044a2b76" containerName="init" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.594055 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47a2842-7ddb-4676-9d20-f507044a2b76" containerName="init" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.594244 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d724a204-d5a3-4294-b5b0-2e8640e4cce0" containerName="dnsmasq-dns" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.594262 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c47a2842-7ddb-4676-9d20-f507044a2b76" containerName="dnsmasq-dns" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.595074 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.597442 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.597492 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.597529 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.598155 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.603848 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf"] Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.662519 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.662557 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5zww\" (UniqueName: \"kubernetes.io/projected/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-kube-api-access-q5zww\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.662649 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.662695 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.764882 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.765218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.765397 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.765564 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5zww\" (UniqueName: \"kubernetes.io/projected/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-kube-api-access-q5zww\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.771334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.772777 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.774154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.785818 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5zww\" (UniqueName: \"kubernetes.io/projected/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-kube-api-access-q5zww\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:23 crc kubenswrapper[4907]: I1009 19:50:23.919601 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:24 crc kubenswrapper[4907]: I1009 19:50:24.465640 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf"] Oct 09 19:50:24 crc kubenswrapper[4907]: I1009 19:50:24.906806 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" event={"ID":"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a","Type":"ContainerStarted","Data":"e7682eb755f1d31c9c9f6512572354280bdc0f0d0eef90c650a65d07a594f5f4"} Oct 09 19:50:27 crc kubenswrapper[4907]: I1009 19:50:27.934449 4907 generic.go:334] "Generic (PLEG): container finished" podID="4e022a77-723b-47bc-98c5-ad7c72aab0c3" containerID="177ca5bc3bd314ddfd08c8340e265cce9111d6df6177f5ed70cb97b5a69c595f" exitCode=0 Oct 09 19:50:27 crc kubenswrapper[4907]: I1009 19:50:27.934542 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4e022a77-723b-47bc-98c5-ad7c72aab0c3","Type":"ContainerDied","Data":"177ca5bc3bd314ddfd08c8340e265cce9111d6df6177f5ed70cb97b5a69c595f"} Oct 09 19:50:27 crc kubenswrapper[4907]: I1009 19:50:27.937986 4907 generic.go:334] "Generic (PLEG): container finished" podID="274be987-64b1-4406-9f04-c81fe651d851" containerID="6cd62c5eb271e58d184e57f85975434fb7e4c80b986c994c58fd0a98035c6b3e" exitCode=0 Oct 09 19:50:27 crc kubenswrapper[4907]: I1009 19:50:27.938024 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"274be987-64b1-4406-9f04-c81fe651d851","Type":"ContainerDied","Data":"6cd62c5eb271e58d184e57f85975434fb7e4c80b986c994c58fd0a98035c6b3e"} Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.025031 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4e022a77-723b-47bc-98c5-ad7c72aab0c3","Type":"ContainerStarted","Data":"fd37046b93b614ddb3e7fa7643c2b01b022712d472f9057e0d3bb8ae83271bb1"} Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.025726 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.027585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"274be987-64b1-4406-9f04-c81fe651d851","Type":"ContainerStarted","Data":"346478ac1bd1d4512ada12a457a31aa4c3059d5568df6f9c1b265438ec877810"} Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.028358 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.029999 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" event={"ID":"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a","Type":"ContainerStarted","Data":"d3204428f12359ad1f42a348d679b8c62f037a7fc4bde4174e1a6ad0207204b3"} Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.050687 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.050669223 podStartE2EDuration="43.050669223s" podCreationTimestamp="2025-10-09 19:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:50:35.048710885 +0000 UTC 
m=+1320.580678384" watchObservedRunningTime="2025-10-09 19:50:35.050669223 +0000 UTC m=+1320.582636712" Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.071557 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" podStartSLOduration=2.45390585 podStartE2EDuration="12.071532241s" podCreationTimestamp="2025-10-09 19:50:23 +0000 UTC" firstStartedPulling="2025-10-09 19:50:24.481818057 +0000 UTC m=+1310.013785546" lastFinishedPulling="2025-10-09 19:50:34.099444448 +0000 UTC m=+1319.631411937" observedRunningTime="2025-10-09 19:50:35.066610631 +0000 UTC m=+1320.598578130" watchObservedRunningTime="2025-10-09 19:50:35.071532241 +0000 UTC m=+1320.603499730" Oct 09 19:50:35 crc kubenswrapper[4907]: I1009 19:50:35.115297 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.115276485 podStartE2EDuration="43.115276485s" podCreationTimestamp="2025-10-09 19:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 19:50:35.089473727 +0000 UTC m=+1320.621441236" watchObservedRunningTime="2025-10-09 19:50:35.115276485 +0000 UTC m=+1320.647243974" Oct 09 19:50:36 crc kubenswrapper[4907]: I1009 19:50:36.298875 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:50:36 crc kubenswrapper[4907]: I1009 19:50:36.299208 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:50:46 crc kubenswrapper[4907]: I1009 19:50:46.170769 4907 generic.go:334] "Generic (PLEG): container finished" podID="82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" containerID="d3204428f12359ad1f42a348d679b8c62f037a7fc4bde4174e1a6ad0207204b3" exitCode=0 Oct 09 19:50:46 crc kubenswrapper[4907]: I1009 19:50:46.170897 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" event={"ID":"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a","Type":"ContainerDied","Data":"d3204428f12359ad1f42a348d679b8c62f037a7fc4bde4174e1a6ad0207204b3"} Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.699277 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.857341 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-repo-setup-combined-ca-bundle\") pod \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.857437 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5zww\" (UniqueName: \"kubernetes.io/projected/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-kube-api-access-q5zww\") pod \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.857490 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-inventory\") pod \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " Oct 09 19:50:47 
crc kubenswrapper[4907]: I1009 19:50:47.857529 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-ssh-key\") pod \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\" (UID: \"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a\") " Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.864648 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" (UID: "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.866634 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-kube-api-access-q5zww" (OuterVolumeSpecName: "kube-api-access-q5zww") pod "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" (UID: "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a"). InnerVolumeSpecName "kube-api-access-q5zww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.887655 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" (UID: "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.906420 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-inventory" (OuterVolumeSpecName: "inventory") pod "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" (UID: "82707ebf-1ae1-4a8e-b3a3-bff2e91e707a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.959778 4907 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.959835 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5zww\" (UniqueName: \"kubernetes.io/projected/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-kube-api-access-q5zww\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.959849 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:47 crc kubenswrapper[4907]: I1009 19:50:47.959863 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/82707ebf-1ae1-4a8e-b3a3-bff2e91e707a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.196911 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" event={"ID":"82707ebf-1ae1-4a8e-b3a3-bff2e91e707a","Type":"ContainerDied","Data":"e7682eb755f1d31c9c9f6512572354280bdc0f0d0eef90c650a65d07a594f5f4"} Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.196954 4907 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="e7682eb755f1d31c9c9f6512572354280bdc0f0d0eef90c650a65d07a594f5f4" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.197320 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.260882 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv"] Oct 09 19:50:48 crc kubenswrapper[4907]: E1009 19:50:48.261367 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.261390 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.261651 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="82707ebf-1ae1-4a8e-b3a3-bff2e91e707a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.262489 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.265381 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.265951 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.266098 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.266310 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.286386 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv"] Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.365412 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.365523 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcqrz\" (UniqueName: \"kubernetes.io/projected/0de1fb87-8f71-4b57-af90-3568d238da35-kube-api-access-zcqrz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.365603 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.467265 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.467365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.467429 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcqrz\" (UniqueName: \"kubernetes.io/projected/0de1fb87-8f71-4b57-af90-3568d238da35-kube-api-access-zcqrz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.470735 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.471992 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.490786 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcqrz\" (UniqueName: \"kubernetes.io/projected/0de1fb87-8f71-4b57-af90-3568d238da35-kube-api-access-zcqrz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-8zclv\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:48 crc kubenswrapper[4907]: I1009 19:50:48.581589 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:49 crc kubenswrapper[4907]: I1009 19:50:49.127650 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv"] Oct 09 19:50:49 crc kubenswrapper[4907]: W1009 19:50:49.129922 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0de1fb87_8f71_4b57_af90_3568d238da35.slice/crio-08adcdbaed2022e4f7156c30dd51a1bde1945db447eb4a5a0b860086d86ed5be WatchSource:0}: Error finding container 08adcdbaed2022e4f7156c30dd51a1bde1945db447eb4a5a0b860086d86ed5be: Status 404 returned error can't find the container with id 08adcdbaed2022e4f7156c30dd51a1bde1945db447eb4a5a0b860086d86ed5be Oct 09 19:50:49 crc kubenswrapper[4907]: I1009 19:50:49.207319 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" event={"ID":"0de1fb87-8f71-4b57-af90-3568d238da35","Type":"ContainerStarted","Data":"08adcdbaed2022e4f7156c30dd51a1bde1945db447eb4a5a0b860086d86ed5be"} Oct 09 19:50:51 crc kubenswrapper[4907]: I1009 19:50:51.229582 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" event={"ID":"0de1fb87-8f71-4b57-af90-3568d238da35","Type":"ContainerStarted","Data":"54e04dfa9a1817aee759e49913019026dfbd3c70718d664bfba52abc51ceb69c"} Oct 09 19:50:51 crc kubenswrapper[4907]: I1009 19:50:51.250742 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" podStartSLOduration=2.373008068 podStartE2EDuration="3.250726754s" podCreationTimestamp="2025-10-09 19:50:48 +0000 UTC" firstStartedPulling="2025-10-09 19:50:49.131966912 +0000 UTC m=+1334.663934401" lastFinishedPulling="2025-10-09 19:50:50.009685588 +0000 UTC m=+1335.541653087" observedRunningTime="2025-10-09 
19:50:51.243864838 +0000 UTC m=+1336.775832357" watchObservedRunningTime="2025-10-09 19:50:51.250726754 +0000 UTC m=+1336.782694233" Oct 09 19:50:52 crc kubenswrapper[4907]: I1009 19:50:52.970673 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 19:50:52 crc kubenswrapper[4907]: I1009 19:50:52.991638 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 19:50:53 crc kubenswrapper[4907]: I1009 19:50:53.249593 4907 generic.go:334] "Generic (PLEG): container finished" podID="0de1fb87-8f71-4b57-af90-3568d238da35" containerID="54e04dfa9a1817aee759e49913019026dfbd3c70718d664bfba52abc51ceb69c" exitCode=0 Oct 09 19:50:53 crc kubenswrapper[4907]: I1009 19:50:53.249640 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" event={"ID":"0de1fb87-8f71-4b57-af90-3568d238da35","Type":"ContainerDied","Data":"54e04dfa9a1817aee759e49913019026dfbd3c70718d664bfba52abc51ceb69c"} Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.662970 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.695280 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcqrz\" (UniqueName: \"kubernetes.io/projected/0de1fb87-8f71-4b57-af90-3568d238da35-kube-api-access-zcqrz\") pod \"0de1fb87-8f71-4b57-af90-3568d238da35\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.695325 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-ssh-key\") pod \"0de1fb87-8f71-4b57-af90-3568d238da35\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.695363 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-inventory\") pod \"0de1fb87-8f71-4b57-af90-3568d238da35\" (UID: \"0de1fb87-8f71-4b57-af90-3568d238da35\") " Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.700856 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0de1fb87-8f71-4b57-af90-3568d238da35-kube-api-access-zcqrz" (OuterVolumeSpecName: "kube-api-access-zcqrz") pod "0de1fb87-8f71-4b57-af90-3568d238da35" (UID: "0de1fb87-8f71-4b57-af90-3568d238da35"). InnerVolumeSpecName "kube-api-access-zcqrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.724307 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0de1fb87-8f71-4b57-af90-3568d238da35" (UID: "0de1fb87-8f71-4b57-af90-3568d238da35"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.748727 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-inventory" (OuterVolumeSpecName: "inventory") pod "0de1fb87-8f71-4b57-af90-3568d238da35" (UID: "0de1fb87-8f71-4b57-af90-3568d238da35"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.797477 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcqrz\" (UniqueName: \"kubernetes.io/projected/0de1fb87-8f71-4b57-af90-3568d238da35-kube-api-access-zcqrz\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.797822 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:54 crc kubenswrapper[4907]: I1009 19:50:54.797837 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0de1fb87-8f71-4b57-af90-3568d238da35-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.291231 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" event={"ID":"0de1fb87-8f71-4b57-af90-3568d238da35","Type":"ContainerDied","Data":"08adcdbaed2022e4f7156c30dd51a1bde1945db447eb4a5a0b860086d86ed5be"} Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.291284 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08adcdbaed2022e4f7156c30dd51a1bde1945db447eb4a5a0b860086d86ed5be" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.291383 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-8zclv" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.410161 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt"] Oct 09 19:50:55 crc kubenswrapper[4907]: E1009 19:50:55.410577 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0de1fb87-8f71-4b57-af90-3568d238da35" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.410599 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0de1fb87-8f71-4b57-af90-3568d238da35" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.410815 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0de1fb87-8f71-4b57-af90-3568d238da35" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.411515 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.416058 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.416323 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.416668 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.416969 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.426282 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt"] Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.510099 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.510156 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26pm\" (UniqueName: \"kubernetes.io/projected/0fc7b172-694d-4880-a68f-15ba2460d816-kube-api-access-r26pm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.510195 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.510215 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.611823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.612101 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26pm\" (UniqueName: \"kubernetes.io/projected/0fc7b172-694d-4880-a68f-15ba2460d816-kube-api-access-r26pm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.612136 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.612156 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.620006 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.621748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.623332 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.629249 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-r26pm\" (UniqueName: \"kubernetes.io/projected/0fc7b172-694d-4880-a68f-15ba2460d816-kube-api-access-r26pm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:55 crc kubenswrapper[4907]: I1009 19:50:55.736897 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:50:56 crc kubenswrapper[4907]: I1009 19:50:56.331862 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt"] Oct 09 19:50:57 crc kubenswrapper[4907]: I1009 19:50:57.313826 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" event={"ID":"0fc7b172-694d-4880-a68f-15ba2460d816","Type":"ContainerStarted","Data":"56878e08b04acf3e0eba77a6ff29c228a4bfc75aac217b800745eca079022e18"} Oct 09 19:50:57 crc kubenswrapper[4907]: I1009 19:50:57.314112 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" event={"ID":"0fc7b172-694d-4880-a68f-15ba2460d816","Type":"ContainerStarted","Data":"b687d4066d207fd8b4a689cb1a7d438e4ec45ad6bdfd9e98df5ff13353aa7631"} Oct 09 19:50:57 crc kubenswrapper[4907]: I1009 19:50:57.338579 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" podStartSLOduration=1.817964174 podStartE2EDuration="2.33855185s" podCreationTimestamp="2025-10-09 19:50:55 +0000 UTC" firstStartedPulling="2025-10-09 19:50:56.333417075 +0000 UTC m=+1341.865384574" lastFinishedPulling="2025-10-09 19:50:56.854004741 +0000 UTC m=+1342.385972250" observedRunningTime="2025-10-09 19:50:57.329030379 +0000 UTC m=+1342.860997898" watchObservedRunningTime="2025-10-09 
19:50:57.33855185 +0000 UTC m=+1342.870519349" Oct 09 19:51:06 crc kubenswrapper[4907]: I1009 19:51:06.299340 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:51:06 crc kubenswrapper[4907]: I1009 19:51:06.299838 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:51:06 crc kubenswrapper[4907]: I1009 19:51:06.299889 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:51:06 crc kubenswrapper[4907]: I1009 19:51:06.300704 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84d17c3295a6b716a77384873892c694131b36c566b6c05af89285bf8e725573"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:51:06 crc kubenswrapper[4907]: I1009 19:51:06.300779 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://84d17c3295a6b716a77384873892c694131b36c566b6c05af89285bf8e725573" gracePeriod=600 Oct 09 19:51:07 crc kubenswrapper[4907]: I1009 19:51:07.421346 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" 
containerID="84d17c3295a6b716a77384873892c694131b36c566b6c05af89285bf8e725573" exitCode=0 Oct 09 19:51:07 crc kubenswrapper[4907]: I1009 19:51:07.421420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"84d17c3295a6b716a77384873892c694131b36c566b6c05af89285bf8e725573"} Oct 09 19:51:07 crc kubenswrapper[4907]: I1009 19:51:07.421965 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879"} Oct 09 19:51:07 crc kubenswrapper[4907]: I1009 19:51:07.421991 4907 scope.go:117] "RemoveContainer" containerID="9461f8fa6da50e0e37d8d2c88aee594214386d7c074bf0b7db5d5d79f7d078a8" Oct 09 19:51:37 crc kubenswrapper[4907]: I1009 19:51:37.550522 4907 scope.go:117] "RemoveContainer" containerID="d8787c13c73cb2c1a48b68c546c8a1e8bd101a32ebeaf3f4e781b61df9d21a10" Oct 09 19:51:37 crc kubenswrapper[4907]: I1009 19:51:37.588398 4907 scope.go:117] "RemoveContainer" containerID="4bb03cbd245283b2571c1bdc5bad5e7f5d6c328f8131dea31949fe6744c6042f" Oct 09 19:51:37 crc kubenswrapper[4907]: I1009 19:51:37.667570 4907 scope.go:117] "RemoveContainer" containerID="d9379f8fa08d8ac3eec835d033c321b7835bb49c3641da5df4a678ca4cad124d" Oct 09 19:52:37 crc kubenswrapper[4907]: I1009 19:52:37.814692 4907 scope.go:117] "RemoveContainer" containerID="2cf0bd7a46306bfaba9d50f007ae22c87fc1243f5071d763cbe337161004b3e6" Oct 09 19:52:37 crc kubenswrapper[4907]: I1009 19:52:37.845286 4907 scope.go:117] "RemoveContainer" containerID="0b94a508daf88e3c8c116f75b2d90521643ab1c9e427223336f6964034206582" Oct 09 19:52:37 crc kubenswrapper[4907]: I1009 19:52:37.902511 4907 scope.go:117] "RemoveContainer" 
containerID="be07f0262303804bfbc1cb8ea506aee01e615e12e1b136f278186de95fee87fa" Oct 09 19:52:37 crc kubenswrapper[4907]: I1009 19:52:37.939530 4907 scope.go:117] "RemoveContainer" containerID="9a8068e2680dad27b8707e21478cc6ff7f3ba7ed24ad8f47e22cfd3d410c8e80" Oct 09 19:53:06 crc kubenswrapper[4907]: I1009 19:53:06.300001 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:53:06 crc kubenswrapper[4907]: I1009 19:53:06.300784 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:53:36 crc kubenswrapper[4907]: I1009 19:53:36.299830 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:53:36 crc kubenswrapper[4907]: I1009 19:53:36.300347 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:54:01 crc kubenswrapper[4907]: I1009 19:54:01.332170 4907 generic.go:334] "Generic (PLEG): container finished" podID="0fc7b172-694d-4880-a68f-15ba2460d816" containerID="56878e08b04acf3e0eba77a6ff29c228a4bfc75aac217b800745eca079022e18" 
exitCode=0 Oct 09 19:54:01 crc kubenswrapper[4907]: I1009 19:54:01.332260 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" event={"ID":"0fc7b172-694d-4880-a68f-15ba2460d816","Type":"ContainerDied","Data":"56878e08b04acf3e0eba77a6ff29c228a4bfc75aac217b800745eca079022e18"} Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.728061 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.843045 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-inventory\") pod \"0fc7b172-694d-4880-a68f-15ba2460d816\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.843118 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-bootstrap-combined-ca-bundle\") pod \"0fc7b172-694d-4880-a68f-15ba2460d816\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.843143 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-ssh-key\") pod \"0fc7b172-694d-4880-a68f-15ba2460d816\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.843192 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26pm\" (UniqueName: \"kubernetes.io/projected/0fc7b172-694d-4880-a68f-15ba2460d816-kube-api-access-r26pm\") pod \"0fc7b172-694d-4880-a68f-15ba2460d816\" (UID: \"0fc7b172-694d-4880-a68f-15ba2460d816\") " Oct 09 19:54:02 
crc kubenswrapper[4907]: I1009 19:54:02.851954 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc7b172-694d-4880-a68f-15ba2460d816-kube-api-access-r26pm" (OuterVolumeSpecName: "kube-api-access-r26pm") pod "0fc7b172-694d-4880-a68f-15ba2460d816" (UID: "0fc7b172-694d-4880-a68f-15ba2460d816"). InnerVolumeSpecName "kube-api-access-r26pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.854139 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0fc7b172-694d-4880-a68f-15ba2460d816" (UID: "0fc7b172-694d-4880-a68f-15ba2460d816"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.883561 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-inventory" (OuterVolumeSpecName: "inventory") pod "0fc7b172-694d-4880-a68f-15ba2460d816" (UID: "0fc7b172-694d-4880-a68f-15ba2460d816"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.885179 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0fc7b172-694d-4880-a68f-15ba2460d816" (UID: "0fc7b172-694d-4880-a68f-15ba2460d816"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.945209 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.945558 4907 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.945570 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fc7b172-694d-4880-a68f-15ba2460d816-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:54:02 crc kubenswrapper[4907]: I1009 19:54:02.945578 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r26pm\" (UniqueName: \"kubernetes.io/projected/0fc7b172-694d-4880-a68f-15ba2460d816-kube-api-access-r26pm\") on node \"crc\" DevicePath \"\"" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.357527 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" event={"ID":"0fc7b172-694d-4880-a68f-15ba2460d816","Type":"ContainerDied","Data":"b687d4066d207fd8b4a689cb1a7d438e4ec45ad6bdfd9e98df5ff13353aa7631"} Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.357581 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b687d4066d207fd8b4a689cb1a7d438e4ec45ad6bdfd9e98df5ff13353aa7631" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.357662 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.477302 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf"] Oct 09 19:54:03 crc kubenswrapper[4907]: E1009 19:54:03.477752 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc7b172-694d-4880-a68f-15ba2460d816" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.477772 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc7b172-694d-4880-a68f-15ba2460d816" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.477999 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc7b172-694d-4880-a68f-15ba2460d816" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.478650 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.482422 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.482942 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.483074 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.489668 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.516708 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf"] Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.662260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.662673 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxz6q\" (UniqueName: \"kubernetes.io/projected/5594394b-d72c-4541-ba69-6342110d2b3a-kube-api-access-zxz6q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 
19:54:03.662967 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.764368 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.764586 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxz6q\" (UniqueName: \"kubernetes.io/projected/5594394b-d72c-4541-ba69-6342110d2b3a-kube-api-access-zxz6q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.764752 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.770985 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.771085 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.803884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxz6q\" (UniqueName: \"kubernetes.io/projected/5594394b-d72c-4541-ba69-6342110d2b3a-kube-api-access-zxz6q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:03 crc kubenswrapper[4907]: I1009 19:54:03.806877 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:54:04 crc kubenswrapper[4907]: I1009 19:54:04.447489 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 19:54:04 crc kubenswrapper[4907]: I1009 19:54:04.455194 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf"] Oct 09 19:54:05 crc kubenswrapper[4907]: I1009 19:54:05.381969 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" event={"ID":"5594394b-d72c-4541-ba69-6342110d2b3a","Type":"ContainerStarted","Data":"62adc6fe4f57d9e92339b4b4dacc2f2a80e12333490cb371419f34bec5608acd"} Oct 09 19:54:05 crc kubenswrapper[4907]: I1009 19:54:05.382531 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" event={"ID":"5594394b-d72c-4541-ba69-6342110d2b3a","Type":"ContainerStarted","Data":"3a38078bc418b42f879d2a484b0704d91594cfdaaf59e720811537d60da051de"} Oct 09 19:54:05 crc kubenswrapper[4907]: I1009 19:54:05.410965 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" podStartSLOduration=1.833161886 podStartE2EDuration="2.410944966s" podCreationTimestamp="2025-10-09 19:54:03 +0000 UTC" firstStartedPulling="2025-10-09 19:54:04.447208523 +0000 UTC m=+1529.979176022" lastFinishedPulling="2025-10-09 19:54:05.024991613 +0000 UTC m=+1530.556959102" observedRunningTime="2025-10-09 19:54:05.406176905 +0000 UTC m=+1530.938144454" watchObservedRunningTime="2025-10-09 19:54:05.410944966 +0000 UTC m=+1530.942912455" Oct 09 19:54:06 crc kubenswrapper[4907]: I1009 19:54:06.299118 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 19:54:06 crc kubenswrapper[4907]: I1009 19:54:06.299188 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 19:54:06 crc kubenswrapper[4907]: I1009 19:54:06.299241 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 19:54:06 crc kubenswrapper[4907]: I1009 19:54:06.299997 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 19:54:06 crc kubenswrapper[4907]: I1009 19:54:06.300059 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" gracePeriod=600 Oct 09 19:54:06 crc kubenswrapper[4907]: E1009 19:54:06.420655 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:54:07 crc kubenswrapper[4907]: I1009 19:54:07.406253 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" exitCode=0 Oct 09 19:54:07 crc kubenswrapper[4907]: I1009 19:54:07.406310 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879"} Oct 09 19:54:07 crc kubenswrapper[4907]: I1009 19:54:07.406718 4907 scope.go:117] "RemoveContainer" containerID="84d17c3295a6b716a77384873892c694131b36c566b6c05af89285bf8e725573" Oct 09 19:54:07 crc kubenswrapper[4907]: I1009 19:54:07.407425 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:54:07 crc kubenswrapper[4907]: E1009 19:54:07.407956 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:54:23 crc kubenswrapper[4907]: I1009 19:54:23.152744 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:54:23 crc kubenswrapper[4907]: E1009 19:54:23.153332 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:54:36 crc kubenswrapper[4907]: I1009 19:54:36.151600 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:54:36 crc kubenswrapper[4907]: E1009 19:54:36.152309 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:54:38 crc kubenswrapper[4907]: I1009 19:54:38.061974 4907 scope.go:117] "RemoveContainer" containerID="4a6e1fdaaedf653a99498956c11e865d2f8110f4af0a352c36bbdb16cbd750e4" Oct 09 19:54:38 crc kubenswrapper[4907]: I1009 19:54:38.098900 4907 scope.go:117] "RemoveContainer" containerID="cab012b751061895123e234f0961d089c5b413dab465ae86c1432711255f56ee" Oct 09 19:54:50 crc kubenswrapper[4907]: I1009 19:54:50.151412 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:54:50 crc kubenswrapper[4907]: E1009 19:54:50.152192 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:55:05 crc kubenswrapper[4907]: 
I1009 19:55:05.160404 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:55:05 crc kubenswrapper[4907]: E1009 19:55:05.161422 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:55:16 crc kubenswrapper[4907]: I1009 19:55:16.152025 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:55:16 crc kubenswrapper[4907]: E1009 19:55:16.152830 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:55:17 crc kubenswrapper[4907]: I1009 19:55:17.040940 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-txm7b"] Oct 09 19:55:17 crc kubenswrapper[4907]: I1009 19:55:17.053177 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-txm7b"] Oct 09 19:55:17 crc kubenswrapper[4907]: I1009 19:55:17.169544 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cab588e-6160-4912-8fde-e12b7f005984" path="/var/lib/kubelet/pods/3cab588e-6160-4912-8fde-e12b7f005984/volumes" Oct 09 19:55:18 crc kubenswrapper[4907]: I1009 19:55:18.036073 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-create-49246"] Oct 09 19:55:18 crc kubenswrapper[4907]: I1009 19:55:18.046020 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-49246"] Oct 09 19:55:19 crc kubenswrapper[4907]: I1009 19:55:19.163598 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f011c439-2d43-4c4e-b1e2-b91c4f2d77e9" path="/var/lib/kubelet/pods/f011c439-2d43-4c4e-b1e2-b91c4f2d77e9/volumes" Oct 09 19:55:23 crc kubenswrapper[4907]: I1009 19:55:23.045636 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-58csx"] Oct 09 19:55:23 crc kubenswrapper[4907]: I1009 19:55:23.055325 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-58csx"] Oct 09 19:55:23 crc kubenswrapper[4907]: I1009 19:55:23.184432 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec9187c-1e24-4ef9-aff7-3b2391d23822" path="/var/lib/kubelet/pods/4ec9187c-1e24-4ef9-aff7-3b2391d23822/volumes" Oct 09 19:55:27 crc kubenswrapper[4907]: I1009 19:55:27.930895 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x8wkh"] Oct 09 19:55:27 crc kubenswrapper[4907]: I1009 19:55:27.933443 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:27 crc kubenswrapper[4907]: I1009 19:55:27.949661 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x8wkh"] Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.028627 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b919-account-create-znlsd"] Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.038857 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b919-account-create-znlsd"] Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.042788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6sf\" (UniqueName: \"kubernetes.io/projected/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-kube-api-access-lw6sf\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.042831 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-utilities\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.042885 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-catalog-content\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.144950 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lw6sf\" (UniqueName: \"kubernetes.io/projected/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-kube-api-access-lw6sf\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.145199 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-utilities\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.145322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-catalog-content\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.145653 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-utilities\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.145782 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-catalog-content\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.165145 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw6sf\" (UniqueName: 
\"kubernetes.io/projected/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-kube-api-access-lw6sf\") pod \"certified-operators-x8wkh\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.269244 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:28 crc kubenswrapper[4907]: I1009 19:55:28.832386 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x8wkh"] Oct 09 19:55:29 crc kubenswrapper[4907]: I1009 19:55:29.033238 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a117-account-create-g9ntg"] Oct 09 19:55:29 crc kubenswrapper[4907]: I1009 19:55:29.049195 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a117-account-create-g9ntg"] Oct 09 19:55:29 crc kubenswrapper[4907]: I1009 19:55:29.167590 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec0c091-4cc8-48a7-a072-fdea2cdc1125" path="/var/lib/kubelet/pods/0ec0c091-4cc8-48a7-a072-fdea2cdc1125/volumes" Oct 09 19:55:29 crc kubenswrapper[4907]: I1009 19:55:29.168380 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="371529d4-9ff3-4104-badb-d3777b626f91" path="/var/lib/kubelet/pods/371529d4-9ff3-4104-badb-d3777b626f91/volumes" Oct 09 19:55:29 crc kubenswrapper[4907]: I1009 19:55:29.204801 4907 generic.go:334] "Generic (PLEG): container finished" podID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerID="e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e" exitCode=0 Oct 09 19:55:29 crc kubenswrapper[4907]: I1009 19:55:29.204860 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8wkh" 
event={"ID":"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b","Type":"ContainerDied","Data":"e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e"} Oct 09 19:55:29 crc kubenswrapper[4907]: I1009 19:55:29.204897 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8wkh" event={"ID":"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b","Type":"ContainerStarted","Data":"e76645f806e912ae270bfcad163cd942060efb2d01cedb4107bfb7d84489541f"} Oct 09 19:55:31 crc kubenswrapper[4907]: I1009 19:55:31.152340 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:55:31 crc kubenswrapper[4907]: E1009 19:55:31.153845 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:55:31 crc kubenswrapper[4907]: I1009 19:55:31.245685 4907 generic.go:334] "Generic (PLEG): container finished" podID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerID="01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987" exitCode=0 Oct 09 19:55:31 crc kubenswrapper[4907]: I1009 19:55:31.245796 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8wkh" event={"ID":"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b","Type":"ContainerDied","Data":"01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987"} Oct 09 19:55:32 crc kubenswrapper[4907]: I1009 19:55:32.259267 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8wkh" 
event={"ID":"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b","Type":"ContainerStarted","Data":"8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa"} Oct 09 19:55:32 crc kubenswrapper[4907]: I1009 19:55:32.287287 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x8wkh" podStartSLOduration=2.561624634 podStartE2EDuration="5.28726354s" podCreationTimestamp="2025-10-09 19:55:27 +0000 UTC" firstStartedPulling="2025-10-09 19:55:29.207263543 +0000 UTC m=+1614.739231042" lastFinishedPulling="2025-10-09 19:55:31.932902409 +0000 UTC m=+1617.464869948" observedRunningTime="2025-10-09 19:55:32.278558159 +0000 UTC m=+1617.810525688" watchObservedRunningTime="2025-10-09 19:55:32.28726354 +0000 UTC m=+1617.819231059" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.696394 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-th58l"] Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.699184 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.756607 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-utilities\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.756736 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9trj6\" (UniqueName: \"kubernetes.io/projected/e6353f17-3296-418a-a0de-70573d1e5597-kube-api-access-9trj6\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.756760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-catalog-content\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.758606 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-th58l"] Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.857816 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9trj6\" (UniqueName: \"kubernetes.io/projected/e6353f17-3296-418a-a0de-70573d1e5597-kube-api-access-9trj6\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.857855 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-catalog-content\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.857919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-utilities\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.878258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-catalog-content\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.878283 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-utilities\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:33 crc kubenswrapper[4907]: I1009 19:55:33.898597 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9trj6\" (UniqueName: \"kubernetes.io/projected/e6353f17-3296-418a-a0de-70573d1e5597-kube-api-access-9trj6\") pod \"redhat-operators-th58l\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:34 crc kubenswrapper[4907]: I1009 19:55:34.027825 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a445-account-create-8pwq4"] Oct 09 
19:55:34 crc kubenswrapper[4907]: I1009 19:55:34.034153 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a445-account-create-8pwq4"] Oct 09 19:55:34 crc kubenswrapper[4907]: I1009 19:55:34.042130 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:34 crc kubenswrapper[4907]: I1009 19:55:34.506810 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-th58l"] Oct 09 19:55:35 crc kubenswrapper[4907]: I1009 19:55:35.177151 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a8729b0-9a3a-4ecf-8167-430807adc85a" path="/var/lib/kubelet/pods/9a8729b0-9a3a-4ecf-8167-430807adc85a/volumes" Oct 09 19:55:35 crc kubenswrapper[4907]: I1009 19:55:35.294966 4907 generic.go:334] "Generic (PLEG): container finished" podID="e6353f17-3296-418a-a0de-70573d1e5597" containerID="1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b" exitCode=0 Oct 09 19:55:35 crc kubenswrapper[4907]: I1009 19:55:35.295036 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th58l" event={"ID":"e6353f17-3296-418a-a0de-70573d1e5597","Type":"ContainerDied","Data":"1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b"} Oct 09 19:55:35 crc kubenswrapper[4907]: I1009 19:55:35.295109 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th58l" event={"ID":"e6353f17-3296-418a-a0de-70573d1e5597","Type":"ContainerStarted","Data":"251fa6b506793cfb30f7fedbfabb411d6c3f4d88f4ce17ec7c0dba37d9f5b91d"} Oct 09 19:55:37 crc kubenswrapper[4907]: I1009 19:55:37.322027 4907 generic.go:334] "Generic (PLEG): container finished" podID="e6353f17-3296-418a-a0de-70573d1e5597" containerID="92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87" exitCode=0 Oct 09 19:55:37 crc kubenswrapper[4907]: I1009 19:55:37.322148 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th58l" event={"ID":"e6353f17-3296-418a-a0de-70573d1e5597","Type":"ContainerDied","Data":"92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87"} Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.171146 4907 scope.go:117] "RemoveContainer" containerID="120bdb60718844df677552c6740ed660025280a11f54a1995b04d679f88585f0" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.269455 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.271803 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.327540 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.403908 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.566908 4907 scope.go:117] "RemoveContainer" containerID="079c9991cda657e601f75ab8f5bd4c8f61086072d411cea7a3eb090d2b7083c8" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.624001 4907 scope.go:117] "RemoveContainer" containerID="af61657c7193fd70c5dce7e743e2ed58a790c5f7a97a1364df82fb9133bbdcd4" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.695406 4907 scope.go:117] "RemoveContainer" containerID="1194eb317fc569e3644462f89cfc41d57e487cc5be070d80d3503ef05de18479" Oct 09 19:55:38 crc kubenswrapper[4907]: I1009 19:55:38.806889 4907 scope.go:117] "RemoveContainer" containerID="9d4337ef10824f1f867f477fb038d7bfb4c7be79845b4c70a00b3a7fe1f7a460" Oct 09 19:55:39 crc kubenswrapper[4907]: I1009 19:55:39.014670 4907 scope.go:117] 
"RemoveContainer" containerID="8610e92d2aa814db2029fb98bc4b4850506b32cf32dd841895a03390c370ce9f" Oct 09 19:55:39 crc kubenswrapper[4907]: I1009 19:55:39.497665 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x8wkh"] Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.383252 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th58l" event={"ID":"e6353f17-3296-418a-a0de-70573d1e5597","Type":"ContainerStarted","Data":"a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562"} Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.383422 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x8wkh" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="registry-server" containerID="cri-o://8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa" gracePeriod=2 Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.401737 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-th58l" podStartSLOduration=2.975071003 podStartE2EDuration="7.401715519s" podCreationTimestamp="2025-10-09 19:55:33 +0000 UTC" firstStartedPulling="2025-10-09 19:55:35.297480616 +0000 UTC m=+1620.829448105" lastFinishedPulling="2025-10-09 19:55:39.724125092 +0000 UTC m=+1625.256092621" observedRunningTime="2025-10-09 19:55:40.398290122 +0000 UTC m=+1625.930257651" watchObservedRunningTime="2025-10-09 19:55:40.401715519 +0000 UTC m=+1625.933683008" Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.825886 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.908340 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-catalog-content\") pod \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.908608 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw6sf\" (UniqueName: \"kubernetes.io/projected/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-kube-api-access-lw6sf\") pod \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.908666 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-utilities\") pod \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\" (UID: \"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b\") " Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.909421 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-utilities" (OuterVolumeSpecName: "utilities") pod "cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" (UID: "cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.915324 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-kube-api-access-lw6sf" (OuterVolumeSpecName: "kube-api-access-lw6sf") pod "cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" (UID: "cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b"). InnerVolumeSpecName "kube-api-access-lw6sf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:55:40 crc kubenswrapper[4907]: I1009 19:55:40.948793 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" (UID: "cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.010546 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.010590 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw6sf\" (UniqueName: \"kubernetes.io/projected/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-kube-api-access-lw6sf\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.010604 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.397332 4907 generic.go:334] "Generic (PLEG): container finished" podID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerID="8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa" exitCode=0 Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.397407 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x8wkh" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.397428 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8wkh" event={"ID":"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b","Type":"ContainerDied","Data":"8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa"} Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.397870 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8wkh" event={"ID":"cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b","Type":"ContainerDied","Data":"e76645f806e912ae270bfcad163cd942060efb2d01cedb4107bfb7d84489541f"} Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.397905 4907 scope.go:117] "RemoveContainer" containerID="8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.424131 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x8wkh"] Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.431404 4907 scope.go:117] "RemoveContainer" containerID="01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.433735 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x8wkh"] Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.455280 4907 scope.go:117] "RemoveContainer" containerID="e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.515708 4907 scope.go:117] "RemoveContainer" containerID="8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa" Oct 09 19:55:41 crc kubenswrapper[4907]: E1009 19:55:41.516255 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa\": container with ID starting with 8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa not found: ID does not exist" containerID="8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.516318 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa"} err="failed to get container status \"8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa\": rpc error: code = NotFound desc = could not find container \"8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa\": container with ID starting with 8fd0f52b720b9abd7958fd5699ae7e1faaaa317c51d8facf22d1ff45a6e042fa not found: ID does not exist" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.516357 4907 scope.go:117] "RemoveContainer" containerID="01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987" Oct 09 19:55:41 crc kubenswrapper[4907]: E1009 19:55:41.516945 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987\": container with ID starting with 01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987 not found: ID does not exist" containerID="01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.516979 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987"} err="failed to get container status \"01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987\": rpc error: code = NotFound desc = could not find container \"01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987\": container with ID 
starting with 01817048ef805635fcf8bab5e580adb6d592f1f9bc93e627d1ab9b4a0de6c987 not found: ID does not exist" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.517001 4907 scope.go:117] "RemoveContainer" containerID="e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e" Oct 09 19:55:41 crc kubenswrapper[4907]: E1009 19:55:41.517423 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e\": container with ID starting with e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e not found: ID does not exist" containerID="e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e" Oct 09 19:55:41 crc kubenswrapper[4907]: I1009 19:55:41.517481 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e"} err="failed to get container status \"e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e\": rpc error: code = NotFound desc = could not find container \"e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e\": container with ID starting with e8d0bf4f640417e935ef6c058e0e23111379a29b66af68ab89fa51e8b551097e not found: ID does not exist" Oct 09 19:55:43 crc kubenswrapper[4907]: I1009 19:55:43.152393 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:55:43 crc kubenswrapper[4907]: E1009 19:55:43.152990 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:55:43 crc kubenswrapper[4907]: I1009 19:55:43.167616 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" path="/var/lib/kubelet/pods/cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b/volumes" Oct 09 19:55:44 crc kubenswrapper[4907]: I1009 19:55:44.042688 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:44 crc kubenswrapper[4907]: I1009 19:55:44.042964 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:45 crc kubenswrapper[4907]: I1009 19:55:45.112011 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-th58l" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="registry-server" probeResult="failure" output=< Oct 09 19:55:45 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 19:55:45 crc kubenswrapper[4907]: > Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.431372 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cw5x7"] Oct 09 19:55:46 crc kubenswrapper[4907]: E1009 19:55:46.432125 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="extract-utilities" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.432140 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="extract-utilities" Oct 09 19:55:46 crc kubenswrapper[4907]: E1009 19:55:46.432188 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="extract-content" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.432197 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="extract-content" Oct 09 19:55:46 crc kubenswrapper[4907]: E1009 19:55:46.432212 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="registry-server" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.432222 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="registry-server" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.432484 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcfe7c8-626a-45d0-834a-ff30d8a4ae0b" containerName="registry-server" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.434189 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.445436 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cw5x7"] Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.450423 4907 generic.go:334] "Generic (PLEG): container finished" podID="5594394b-d72c-4541-ba69-6342110d2b3a" containerID="62adc6fe4f57d9e92339b4b4dacc2f2a80e12333490cb371419f34bec5608acd" exitCode=0 Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.450505 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" event={"ID":"5594394b-d72c-4541-ba69-6342110d2b3a","Type":"ContainerDied","Data":"62adc6fe4f57d9e92339b4b4dacc2f2a80e12333490cb371419f34bec5608acd"} Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.525746 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-catalog-content\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") 
" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.525834 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-utilities\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.525885 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfrsf\" (UniqueName: \"kubernetes.io/projected/80d36b97-5f42-48a4-93eb-56cd2962556d-kube-api-access-cfrsf\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.628116 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrsf\" (UniqueName: \"kubernetes.io/projected/80d36b97-5f42-48a4-93eb-56cd2962556d-kube-api-access-cfrsf\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.628313 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-catalog-content\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.628383 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-utilities\") pod \"community-operators-cw5x7\" (UID: 
\"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.628989 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-utilities\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.629076 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-catalog-content\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.655675 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfrsf\" (UniqueName: \"kubernetes.io/projected/80d36b97-5f42-48a4-93eb-56cd2962556d-kube-api-access-cfrsf\") pod \"community-operators-cw5x7\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:46 crc kubenswrapper[4907]: I1009 19:55:46.765458 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.292790 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cw5x7"] Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.463288 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cw5x7" event={"ID":"80d36b97-5f42-48a4-93eb-56cd2962556d","Type":"ContainerStarted","Data":"3cee20adad878edc57fe3b24b5fab220a5c3f9ec235c9ab55b23d34962770abd"} Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.849719 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.955571 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-inventory\") pod \"5594394b-d72c-4541-ba69-6342110d2b3a\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.956013 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-ssh-key\") pod \"5594394b-d72c-4541-ba69-6342110d2b3a\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.956413 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxz6q\" (UniqueName: \"kubernetes.io/projected/5594394b-d72c-4541-ba69-6342110d2b3a-kube-api-access-zxz6q\") pod \"5594394b-d72c-4541-ba69-6342110d2b3a\" (UID: \"5594394b-d72c-4541-ba69-6342110d2b3a\") " Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.961760 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5594394b-d72c-4541-ba69-6342110d2b3a-kube-api-access-zxz6q" (OuterVolumeSpecName: "kube-api-access-zxz6q") pod "5594394b-d72c-4541-ba69-6342110d2b3a" (UID: "5594394b-d72c-4541-ba69-6342110d2b3a"). InnerVolumeSpecName "kube-api-access-zxz6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:55:47 crc kubenswrapper[4907]: I1009 19:55:47.988875 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-inventory" (OuterVolumeSpecName: "inventory") pod "5594394b-d72c-4541-ba69-6342110d2b3a" (UID: "5594394b-d72c-4541-ba69-6342110d2b3a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.006328 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5594394b-d72c-4541-ba69-6342110d2b3a" (UID: "5594394b-d72c-4541-ba69-6342110d2b3a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.060550 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxz6q\" (UniqueName: \"kubernetes.io/projected/5594394b-d72c-4541-ba69-6342110d2b3a-kube-api-access-zxz6q\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.060622 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.060648 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5594394b-d72c-4541-ba69-6342110d2b3a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.476101 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.476100 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf" event={"ID":"5594394b-d72c-4541-ba69-6342110d2b3a","Type":"ContainerDied","Data":"3a38078bc418b42f879d2a484b0704d91594cfdaaf59e720811537d60da051de"} Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.476288 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a38078bc418b42f879d2a484b0704d91594cfdaaf59e720811537d60da051de" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.479508 4907 generic.go:334] "Generic (PLEG): container finished" podID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerID="28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063" exitCode=0 Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.479549 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-cw5x7" event={"ID":"80d36b97-5f42-48a4-93eb-56cd2962556d","Type":"ContainerDied","Data":"28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063"} Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.616570 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx"] Oct 09 19:55:48 crc kubenswrapper[4907]: E1009 19:55:48.617672 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5594394b-d72c-4541-ba69-6342110d2b3a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.617701 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5594394b-d72c-4541-ba69-6342110d2b3a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.618371 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5594394b-d72c-4541-ba69-6342110d2b3a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.619772 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.631207 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.631536 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.631631 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.631853 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.631965 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx"] Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.693029 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.693326 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.693478 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62b9l\" (UniqueName: \"kubernetes.io/projected/4f6c717a-ca37-4879-babe-36221d9580fa-kube-api-access-62b9l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.794890 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.795009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.795074 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62b9l\" (UniqueName: \"kubernetes.io/projected/4f6c717a-ca37-4879-babe-36221d9580fa-kube-api-access-62b9l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.802080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.802459 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:48 crc kubenswrapper[4907]: I1009 19:55:48.816169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62b9l\" (UniqueName: \"kubernetes.io/projected/4f6c717a-ca37-4879-babe-36221d9580fa-kube-api-access-62b9l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:49 crc kubenswrapper[4907]: I1009 19:55:49.022224 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:55:49 crc kubenswrapper[4907]: I1009 19:55:49.531539 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx"] Oct 09 19:55:49 crc kubenswrapper[4907]: W1009 19:55:49.533842 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f6c717a_ca37_4879_babe_36221d9580fa.slice/crio-961cc1472207cbc57e1adc2383d9d0d799d4836d959c1ba6bfb8ae4077175197 WatchSource:0}: Error finding container 961cc1472207cbc57e1adc2383d9d0d799d4836d959c1ba6bfb8ae4077175197: Status 404 returned error can't find the container with id 961cc1472207cbc57e1adc2383d9d0d799d4836d959c1ba6bfb8ae4077175197 Oct 09 19:55:50 crc kubenswrapper[4907]: I1009 19:55:50.500507 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" event={"ID":"4f6c717a-ca37-4879-babe-36221d9580fa","Type":"ContainerStarted","Data":"961cc1472207cbc57e1adc2383d9d0d799d4836d959c1ba6bfb8ae4077175197"} Oct 09 19:55:50 crc kubenswrapper[4907]: I1009 19:55:50.503802 4907 generic.go:334] "Generic (PLEG): container finished" podID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerID="fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9" exitCode=0 Oct 09 19:55:50 crc kubenswrapper[4907]: I1009 19:55:50.503834 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cw5x7" event={"ID":"80d36b97-5f42-48a4-93eb-56cd2962556d","Type":"ContainerDied","Data":"fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9"} Oct 09 19:55:51 crc kubenswrapper[4907]: I1009 19:55:51.518701 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cw5x7" 
event={"ID":"80d36b97-5f42-48a4-93eb-56cd2962556d","Type":"ContainerStarted","Data":"a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6"} Oct 09 19:55:51 crc kubenswrapper[4907]: I1009 19:55:51.522364 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" event={"ID":"4f6c717a-ca37-4879-babe-36221d9580fa","Type":"ContainerStarted","Data":"38dc5559724783de90161c5cbfd7e0a2d0765457fb3fbffa7dc7e6e7bac52478"} Oct 09 19:55:51 crc kubenswrapper[4907]: I1009 19:55:51.547405 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cw5x7" podStartSLOduration=2.823425013 podStartE2EDuration="5.547387803s" podCreationTimestamp="2025-10-09 19:55:46 +0000 UTC" firstStartedPulling="2025-10-09 19:55:48.481563792 +0000 UTC m=+1634.013531281" lastFinishedPulling="2025-10-09 19:55:51.205526582 +0000 UTC m=+1636.737494071" observedRunningTime="2025-10-09 19:55:51.540534849 +0000 UTC m=+1637.072502348" watchObservedRunningTime="2025-10-09 19:55:51.547387803 +0000 UTC m=+1637.079355292" Oct 09 19:55:51 crc kubenswrapper[4907]: I1009 19:55:51.564932 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" podStartSLOduration=2.655023732 podStartE2EDuration="3.564912466s" podCreationTimestamp="2025-10-09 19:55:48 +0000 UTC" firstStartedPulling="2025-10-09 19:55:49.53623057 +0000 UTC m=+1635.068198099" lastFinishedPulling="2025-10-09 19:55:50.446119344 +0000 UTC m=+1635.978086833" observedRunningTime="2025-10-09 19:55:51.560962096 +0000 UTC m=+1637.092929585" watchObservedRunningTime="2025-10-09 19:55:51.564912466 +0000 UTC m=+1637.096879955" Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.040323 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7dcmv"] Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 
19:55:54.053376 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zn9nz"] Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.061222 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-m7fvd"] Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.068811 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zn9nz"] Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.078563 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7dcmv"] Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.082604 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-m7fvd"] Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.092629 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.141164 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:54 crc kubenswrapper[4907]: I1009 19:55:54.328646 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-th58l"] Oct 09 19:55:55 crc kubenswrapper[4907]: I1009 19:55:55.165975 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913f8dda-cdec-4099-9f66-8046eeab3371" path="/var/lib/kubelet/pods/913f8dda-cdec-4099-9f66-8046eeab3371/volumes" Oct 09 19:55:55 crc kubenswrapper[4907]: I1009 19:55:55.166596 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3" path="/var/lib/kubelet/pods/9dcfc9a5-d565-49bc-ac06-f6ba77d6d9b3/volumes" Oct 09 19:55:55 crc kubenswrapper[4907]: I1009 19:55:55.167150 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed98c56-4150-49f3-b183-440cd6fabc12" 
path="/var/lib/kubelet/pods/9ed98c56-4150-49f3-b183-440cd6fabc12/volumes" Oct 09 19:55:55 crc kubenswrapper[4907]: I1009 19:55:55.565088 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-th58l" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="registry-server" containerID="cri-o://a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562" gracePeriod=2 Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.037499 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.151413 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:55:56 crc kubenswrapper[4907]: E1009 19:55:56.151772 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.173344 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-utilities\") pod \"e6353f17-3296-418a-a0de-70573d1e5597\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.173411 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-catalog-content\") pod \"e6353f17-3296-418a-a0de-70573d1e5597\" (UID: 
\"e6353f17-3296-418a-a0de-70573d1e5597\") " Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.173543 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9trj6\" (UniqueName: \"kubernetes.io/projected/e6353f17-3296-418a-a0de-70573d1e5597-kube-api-access-9trj6\") pod \"e6353f17-3296-418a-a0de-70573d1e5597\" (UID: \"e6353f17-3296-418a-a0de-70573d1e5597\") " Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.175041 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-utilities" (OuterVolumeSpecName: "utilities") pod "e6353f17-3296-418a-a0de-70573d1e5597" (UID: "e6353f17-3296-418a-a0de-70573d1e5597"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.178716 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6353f17-3296-418a-a0de-70573d1e5597-kube-api-access-9trj6" (OuterVolumeSpecName: "kube-api-access-9trj6") pod "e6353f17-3296-418a-a0de-70573d1e5597" (UID: "e6353f17-3296-418a-a0de-70573d1e5597"). InnerVolumeSpecName "kube-api-access-9trj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.254702 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6353f17-3296-418a-a0de-70573d1e5597" (UID: "e6353f17-3296-418a-a0de-70573d1e5597"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.290414 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9trj6\" (UniqueName: \"kubernetes.io/projected/e6353f17-3296-418a-a0de-70573d1e5597-kube-api-access-9trj6\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.290473 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.290488 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6353f17-3296-418a-a0de-70573d1e5597-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.579203 4907 generic.go:334] "Generic (PLEG): container finished" podID="e6353f17-3296-418a-a0de-70573d1e5597" containerID="a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562" exitCode=0 Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.579289 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th58l" event={"ID":"e6353f17-3296-418a-a0de-70573d1e5597","Type":"ContainerDied","Data":"a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562"} Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.579322 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-th58l" event={"ID":"e6353f17-3296-418a-a0de-70573d1e5597","Type":"ContainerDied","Data":"251fa6b506793cfb30f7fedbfabb411d6c3f4d88f4ce17ec7c0dba37d9f5b91d"} Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.579343 4907 scope.go:117] "RemoveContainer" containerID="a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.579343 
4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-th58l" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.603373 4907 scope.go:117] "RemoveContainer" containerID="92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.629088 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-th58l"] Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.644671 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-th58l"] Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.649362 4907 scope.go:117] "RemoveContainer" containerID="1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.674298 4907 scope.go:117] "RemoveContainer" containerID="a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562" Oct 09 19:55:56 crc kubenswrapper[4907]: E1009 19:55:56.674863 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562\": container with ID starting with a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562 not found: ID does not exist" containerID="a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.674931 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562"} err="failed to get container status \"a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562\": rpc error: code = NotFound desc = could not find container \"a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562\": container with ID starting with 
a2372d285827bccda560d40ff11145a2a70691b3e4fab2032fd7385fc93d4562 not found: ID does not exist" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.674962 4907 scope.go:117] "RemoveContainer" containerID="92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87" Oct 09 19:55:56 crc kubenswrapper[4907]: E1009 19:55:56.675563 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87\": container with ID starting with 92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87 not found: ID does not exist" containerID="92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.675593 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87"} err="failed to get container status \"92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87\": rpc error: code = NotFound desc = could not find container \"92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87\": container with ID starting with 92ca5dc3702ee38d7a160083c56bd47be91cf213c69e28be49dd18d7bfcc5e87 not found: ID does not exist" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.675607 4907 scope.go:117] "RemoveContainer" containerID="1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b" Oct 09 19:55:56 crc kubenswrapper[4907]: E1009 19:55:56.675924 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b\": container with ID starting with 1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b not found: ID does not exist" containerID="1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b" Oct 09 19:55:56 crc 
kubenswrapper[4907]: I1009 19:55:56.675960 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b"} err="failed to get container status \"1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b\": rpc error: code = NotFound desc = could not find container \"1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b\": container with ID starting with 1eed9478ba759baf157205a59e4d478280c2e8af1015d249664b97477a7ebb0b not found: ID does not exist" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.765716 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.766682 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:56 crc kubenswrapper[4907]: I1009 19:55:56.814243 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:57 crc kubenswrapper[4907]: I1009 19:55:57.031444 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-64pxz"] Oct 09 19:55:57 crc kubenswrapper[4907]: I1009 19:55:57.041683 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-64pxz"] Oct 09 19:55:57 crc kubenswrapper[4907]: I1009 19:55:57.163851 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4638ed26-9fde-4eca-acc3-9d292f50e4bb" path="/var/lib/kubelet/pods/4638ed26-9fde-4eca-acc3-9d292f50e4bb/volumes" Oct 09 19:55:57 crc kubenswrapper[4907]: I1009 19:55:57.164858 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6353f17-3296-418a-a0de-70573d1e5597" path="/var/lib/kubelet/pods/e6353f17-3296-418a-a0de-70573d1e5597/volumes" Oct 09 19:55:57 crc kubenswrapper[4907]: 
I1009 19:55:57.633103 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:55:59 crc kubenswrapper[4907]: I1009 19:55:59.045535 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5tzcp"] Oct 09 19:55:59 crc kubenswrapper[4907]: I1009 19:55:59.057097 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5tzcp"] Oct 09 19:55:59 crc kubenswrapper[4907]: I1009 19:55:59.131273 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cw5x7"] Oct 09 19:55:59 crc kubenswrapper[4907]: I1009 19:55:59.190910 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed1546f-98a4-4b57-b79e-8defa04c38b8" path="/var/lib/kubelet/pods/5ed1546f-98a4-4b57-b79e-8defa04c38b8/volumes" Oct 09 19:56:00 crc kubenswrapper[4907]: I1009 19:56:00.624622 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cw5x7" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="registry-server" containerID="cri-o://a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6" gracePeriod=2 Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.604202 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.649408 4907 generic.go:334] "Generic (PLEG): container finished" podID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerID="a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6" exitCode=0 Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.649484 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cw5x7" event={"ID":"80d36b97-5f42-48a4-93eb-56cd2962556d","Type":"ContainerDied","Data":"a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6"} Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.649521 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cw5x7" event={"ID":"80d36b97-5f42-48a4-93eb-56cd2962556d","Type":"ContainerDied","Data":"3cee20adad878edc57fe3b24b5fab220a5c3f9ec235c9ab55b23d34962770abd"} Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.649523 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cw5x7" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.649545 4907 scope.go:117] "RemoveContainer" containerID="a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.675269 4907 scope.go:117] "RemoveContainer" containerID="fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.700146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-utilities\") pod \"80d36b97-5f42-48a4-93eb-56cd2962556d\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.700220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-catalog-content\") pod \"80d36b97-5f42-48a4-93eb-56cd2962556d\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.700543 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfrsf\" (UniqueName: \"kubernetes.io/projected/80d36b97-5f42-48a4-93eb-56cd2962556d-kube-api-access-cfrsf\") pod \"80d36b97-5f42-48a4-93eb-56cd2962556d\" (UID: \"80d36b97-5f42-48a4-93eb-56cd2962556d\") " Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.701316 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-utilities" (OuterVolumeSpecName: "utilities") pod "80d36b97-5f42-48a4-93eb-56cd2962556d" (UID: "80d36b97-5f42-48a4-93eb-56cd2962556d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.705710 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d36b97-5f42-48a4-93eb-56cd2962556d-kube-api-access-cfrsf" (OuterVolumeSpecName: "kube-api-access-cfrsf") pod "80d36b97-5f42-48a4-93eb-56cd2962556d" (UID: "80d36b97-5f42-48a4-93eb-56cd2962556d"). InnerVolumeSpecName "kube-api-access-cfrsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.708971 4907 scope.go:117] "RemoveContainer" containerID="28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.754833 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80d36b97-5f42-48a4-93eb-56cd2962556d" (UID: "80d36b97-5f42-48a4-93eb-56cd2962556d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.790425 4907 scope.go:117] "RemoveContainer" containerID="a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6" Oct 09 19:56:01 crc kubenswrapper[4907]: E1009 19:56:01.791077 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6\": container with ID starting with a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6 not found: ID does not exist" containerID="a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.791115 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6"} err="failed to get container status \"a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6\": rpc error: code = NotFound desc = could not find container \"a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6\": container with ID starting with a24dca9317d464017fb0c85bfcfc69c9679b89fced02124b35d14a9dbc1fa8f6 not found: ID does not exist" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.791138 4907 scope.go:117] "RemoveContainer" containerID="fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9" Oct 09 19:56:01 crc kubenswrapper[4907]: E1009 19:56:01.791526 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9\": container with ID starting with fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9 not found: ID does not exist" containerID="fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.791550 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9"} err="failed to get container status \"fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9\": rpc error: code = NotFound desc = could not find container \"fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9\": container with ID starting with fd8bdcfaf26ff8a5e8ecc9352f6e57db946591db980c031c3db29bc7331ee9c9 not found: ID does not exist" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.791583 4907 scope.go:117] "RemoveContainer" containerID="28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063" Oct 09 19:56:01 crc kubenswrapper[4907]: E1009 19:56:01.792033 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063\": container with ID starting with 28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063 not found: ID does not exist" containerID="28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.792062 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063"} err="failed to get container status \"28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063\": rpc error: code = NotFound desc = could not find container \"28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063\": container with ID starting with 28158309fc993ddee84df5fa71d33d650f65fe04ccbec6f5128251afeca23063 not found: ID does not exist" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.802779 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-utilities\") on node 
\"crc\" DevicePath \"\"" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.802806 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80d36b97-5f42-48a4-93eb-56cd2962556d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:56:01 crc kubenswrapper[4907]: I1009 19:56:01.802820 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfrsf\" (UniqueName: \"kubernetes.io/projected/80d36b97-5f42-48a4-93eb-56cd2962556d-kube-api-access-cfrsf\") on node \"crc\" DevicePath \"\"" Oct 09 19:56:02 crc kubenswrapper[4907]: I1009 19:56:02.002201 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cw5x7"] Oct 09 19:56:02 crc kubenswrapper[4907]: I1009 19:56:02.012774 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cw5x7"] Oct 09 19:56:03 crc kubenswrapper[4907]: I1009 19:56:03.170506 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" path="/var/lib/kubelet/pods/80d36b97-5f42-48a4-93eb-56cd2962556d/volumes" Oct 09 19:56:07 crc kubenswrapper[4907]: I1009 19:56:07.031671 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e8ad-account-create-65dcs"] Oct 09 19:56:07 crc kubenswrapper[4907]: I1009 19:56:07.042101 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e8ad-account-create-65dcs"] Oct 09 19:56:07 crc kubenswrapper[4907]: I1009 19:56:07.151146 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:56:07 crc kubenswrapper[4907]: E1009 19:56:07.151398 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:56:07 crc kubenswrapper[4907]: I1009 19:56:07.164017 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ff2dda-76aa-4595-bd0c-e69456106650" path="/var/lib/kubelet/pods/c1ff2dda-76aa-4595-bd0c-e69456106650/volumes" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.666827 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pft4f"] Oct 09 19:56:09 crc kubenswrapper[4907]: E1009 19:56:09.667597 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="extract-content" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.667639 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="extract-content" Oct 09 19:56:09 crc kubenswrapper[4907]: E1009 19:56:09.667661 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="extract-utilities" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.667667 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="extract-utilities" Oct 09 19:56:09 crc kubenswrapper[4907]: E1009 19:56:09.668608 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="extract-utilities" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.668618 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="extract-utilities" Oct 09 19:56:09 crc kubenswrapper[4907]: E1009 19:56:09.668632 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="registry-server" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.668638 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="registry-server" Oct 09 19:56:09 crc kubenswrapper[4907]: E1009 19:56:09.668649 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="extract-content" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.668656 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="extract-content" Oct 09 19:56:09 crc kubenswrapper[4907]: E1009 19:56:09.668672 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="registry-server" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.668677 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="registry-server" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.668850 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d36b97-5f42-48a4-93eb-56cd2962556d" containerName="registry-server" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.668865 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6353f17-3296-418a-a0de-70573d1e5597" containerName="registry-server" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.670357 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.681919 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pft4f"] Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.770921 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-utilities\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.771041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-catalog-content\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.771072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcw5k\" (UniqueName: \"kubernetes.io/projected/ca6be656-e82a-4901-a882-eea679443872-kube-api-access-lcw5k\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.873773 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-catalog-content\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.874224 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lcw5k\" (UniqueName: \"kubernetes.io/projected/ca6be656-e82a-4901-a882-eea679443872-kube-api-access-lcw5k\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.874298 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-catalog-content\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.874494 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-utilities\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.875082 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-utilities\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:09 crc kubenswrapper[4907]: I1009 19:56:09.894753 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcw5k\" (UniqueName: \"kubernetes.io/projected/ca6be656-e82a-4901-a882-eea679443872-kube-api-access-lcw5k\") pod \"redhat-marketplace-pft4f\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:10 crc kubenswrapper[4907]: I1009 19:56:10.031410 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-2d90-account-create-cwgs4"] Oct 09 19:56:10 crc kubenswrapper[4907]: I1009 19:56:10.039562 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-aecc-account-create-9lqqk"] Oct 09 19:56:10 crc kubenswrapper[4907]: I1009 19:56:10.044060 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:10 crc kubenswrapper[4907]: I1009 19:56:10.047500 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2d90-account-create-cwgs4"] Oct 09 19:56:10 crc kubenswrapper[4907]: I1009 19:56:10.053772 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-aecc-account-create-9lqqk"] Oct 09 19:56:10 crc kubenswrapper[4907]: I1009 19:56:10.521068 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pft4f"] Oct 09 19:56:10 crc kubenswrapper[4907]: I1009 19:56:10.756461 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pft4f" event={"ID":"ca6be656-e82a-4901-a882-eea679443872","Type":"ContainerStarted","Data":"b43145dae499a556d305df9289fc3eaa7e12e5d97333112327f85c09e2995f11"} Oct 09 19:56:11 crc kubenswrapper[4907]: I1009 19:56:11.163636 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="562823d8-4ea2-43cd-8e94-5c1c154de988" path="/var/lib/kubelet/pods/562823d8-4ea2-43cd-8e94-5c1c154de988/volumes" Oct 09 19:56:11 crc kubenswrapper[4907]: I1009 19:56:11.165040 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a28eaa-918c-4756-9936-1d724410617c" path="/var/lib/kubelet/pods/f8a28eaa-918c-4756-9936-1d724410617c/volumes" Oct 09 19:56:11 crc kubenswrapper[4907]: I1009 19:56:11.771396 4907 generic.go:334] "Generic (PLEG): container finished" podID="ca6be656-e82a-4901-a882-eea679443872" containerID="45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41" 
exitCode=0 Oct 09 19:56:11 crc kubenswrapper[4907]: I1009 19:56:11.771561 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pft4f" event={"ID":"ca6be656-e82a-4901-a882-eea679443872","Type":"ContainerDied","Data":"45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41"} Oct 09 19:56:12 crc kubenswrapper[4907]: I1009 19:56:12.785256 4907 generic.go:334] "Generic (PLEG): container finished" podID="ca6be656-e82a-4901-a882-eea679443872" containerID="2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945" exitCode=0 Oct 09 19:56:12 crc kubenswrapper[4907]: I1009 19:56:12.785313 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pft4f" event={"ID":"ca6be656-e82a-4901-a882-eea679443872","Type":"ContainerDied","Data":"2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945"} Oct 09 19:56:13 crc kubenswrapper[4907]: I1009 19:56:13.798303 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pft4f" event={"ID":"ca6be656-e82a-4901-a882-eea679443872","Type":"ContainerStarted","Data":"d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a"} Oct 09 19:56:13 crc kubenswrapper[4907]: I1009 19:56:13.830099 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pft4f" podStartSLOduration=3.410085239 podStartE2EDuration="4.830080841s" podCreationTimestamp="2025-10-09 19:56:09 +0000 UTC" firstStartedPulling="2025-10-09 19:56:11.775258154 +0000 UTC m=+1657.307225653" lastFinishedPulling="2025-10-09 19:56:13.195253776 +0000 UTC m=+1658.727221255" observedRunningTime="2025-10-09 19:56:13.821776981 +0000 UTC m=+1659.353744510" watchObservedRunningTime="2025-10-09 19:56:13.830080841 +0000 UTC m=+1659.362048330" Oct 09 19:56:14 crc kubenswrapper[4907]: I1009 19:56:14.029834 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-5j4pr"] Oct 09 19:56:14 crc kubenswrapper[4907]: I1009 19:56:14.038264 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-5j4pr"] Oct 09 19:56:15 crc kubenswrapper[4907]: I1009 19:56:15.170993 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b99c09-f14d-462f-a4fa-e2555b429611" path="/var/lib/kubelet/pods/39b99c09-f14d-462f-a4fa-e2555b429611/volumes" Oct 09 19:56:19 crc kubenswrapper[4907]: I1009 19:56:19.151755 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:56:19 crc kubenswrapper[4907]: E1009 19:56:19.152802 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:56:20 crc kubenswrapper[4907]: I1009 19:56:20.044232 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:20 crc kubenswrapper[4907]: I1009 19:56:20.044988 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:20 crc kubenswrapper[4907]: I1009 19:56:20.117846 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:20 crc kubenswrapper[4907]: I1009 19:56:20.948848 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:21 crc kubenswrapper[4907]: I1009 19:56:21.014332 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pft4f"] Oct 09 19:56:22 crc kubenswrapper[4907]: I1009 19:56:22.918888 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pft4f" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="registry-server" containerID="cri-o://d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a" gracePeriod=2 Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.054063 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nw9fr"] Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.066575 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nw9fr"] Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.166326 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3946131f-dcc0-4dd6-bd57-2e7afbebeb78" path="/var/lib/kubelet/pods/3946131f-dcc0-4dd6-bd57-2e7afbebeb78/volumes" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.363773 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.457217 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-utilities\") pod \"ca6be656-e82a-4901-a882-eea679443872\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.457328 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-catalog-content\") pod \"ca6be656-e82a-4901-a882-eea679443872\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.457681 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcw5k\" (UniqueName: \"kubernetes.io/projected/ca6be656-e82a-4901-a882-eea679443872-kube-api-access-lcw5k\") pod \"ca6be656-e82a-4901-a882-eea679443872\" (UID: \"ca6be656-e82a-4901-a882-eea679443872\") " Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.458574 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-utilities" (OuterVolumeSpecName: "utilities") pod "ca6be656-e82a-4901-a882-eea679443872" (UID: "ca6be656-e82a-4901-a882-eea679443872"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.463974 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6be656-e82a-4901-a882-eea679443872-kube-api-access-lcw5k" (OuterVolumeSpecName: "kube-api-access-lcw5k") pod "ca6be656-e82a-4901-a882-eea679443872" (UID: "ca6be656-e82a-4901-a882-eea679443872"). InnerVolumeSpecName "kube-api-access-lcw5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.470563 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca6be656-e82a-4901-a882-eea679443872" (UID: "ca6be656-e82a-4901-a882-eea679443872"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.560030 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcw5k\" (UniqueName: \"kubernetes.io/projected/ca6be656-e82a-4901-a882-eea679443872-kube-api-access-lcw5k\") on node \"crc\" DevicePath \"\"" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.560174 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.560186 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6be656-e82a-4901-a882-eea679443872-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.935614 4907 generic.go:334] "Generic (PLEG): container finished" podID="ca6be656-e82a-4901-a882-eea679443872" containerID="d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a" exitCode=0 Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.935667 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pft4f" event={"ID":"ca6be656-e82a-4901-a882-eea679443872","Type":"ContainerDied","Data":"d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a"} Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.935707 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pft4f" event={"ID":"ca6be656-e82a-4901-a882-eea679443872","Type":"ContainerDied","Data":"b43145dae499a556d305df9289fc3eaa7e12e5d97333112327f85c09e2995f11"} Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.935729 4907 scope.go:117] "RemoveContainer" containerID="d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.936625 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pft4f" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.969940 4907 scope.go:117] "RemoveContainer" containerID="2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945" Oct 09 19:56:23 crc kubenswrapper[4907]: I1009 19:56:23.986925 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pft4f"] Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.000555 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pft4f"] Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.017068 4907 scope.go:117] "RemoveContainer" containerID="45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41" Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.037544 4907 scope.go:117] "RemoveContainer" containerID="d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a" Oct 09 19:56:24 crc kubenswrapper[4907]: E1009 19:56:24.038013 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a\": container with ID starting with d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a not found: ID does not exist" containerID="d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a" Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.038050 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a"} err="failed to get container status \"d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a\": rpc error: code = NotFound desc = could not find container \"d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a\": container with ID starting with d6948c02f9343fe6346330597f6f6f963e6caa72c7cd6d36bb7ef90858c1955a not found: ID does not exist" Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.038077 4907 scope.go:117] "RemoveContainer" containerID="2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945" Oct 09 19:56:24 crc kubenswrapper[4907]: E1009 19:56:24.038398 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945\": container with ID starting with 2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945 not found: ID does not exist" containerID="2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945" Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.038446 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945"} err="failed to get container status \"2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945\": rpc error: code = NotFound desc = could not find container \"2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945\": container with ID starting with 2a92c1e9dcdf111356d63a88738417812a836f5d79a7fb4abd7f38bec6e3a945 not found: ID does not exist" Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.038724 4907 scope.go:117] "RemoveContainer" containerID="45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41" Oct 09 19:56:24 crc kubenswrapper[4907]: E1009 
19:56:24.039039 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41\": container with ID starting with 45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41 not found: ID does not exist" containerID="45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41" Oct 09 19:56:24 crc kubenswrapper[4907]: I1009 19:56:24.039099 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41"} err="failed to get container status \"45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41\": rpc error: code = NotFound desc = could not find container \"45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41\": container with ID starting with 45eda2017575265abc5dad567e758b1c1d8eaa06ebc0769262f6788cd86eba41 not found: ID does not exist" Oct 09 19:56:25 crc kubenswrapper[4907]: I1009 19:56:25.174259 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6be656-e82a-4901-a882-eea679443872" path="/var/lib/kubelet/pods/ca6be656-e82a-4901-a882-eea679443872/volumes" Oct 09 19:56:30 crc kubenswrapper[4907]: I1009 19:56:30.152252 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:56:30 crc kubenswrapper[4907]: E1009 19:56:30.153685 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:56:36 crc kubenswrapper[4907]: I1009 19:56:36.055695 
4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p2jqv"] Oct 09 19:56:36 crc kubenswrapper[4907]: I1009 19:56:36.065644 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-p2jqv"] Oct 09 19:56:37 crc kubenswrapper[4907]: I1009 19:56:37.165194 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515d2cd8-8594-45c9-82c4-2605da80d58c" path="/var/lib/kubelet/pods/515d2cd8-8594-45c9-82c4-2605da80d58c/volumes" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.531323 4907 scope.go:117] "RemoveContainer" containerID="9083c9c99d7b63e52c8565a521f8f36b93e578a16283f6084296af5850817f5c" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.579846 4907 scope.go:117] "RemoveContainer" containerID="7a3ecf55a431fb96c7ac1f01905ed2d621dfee8f504d89a6ed9aaa4428e541ac" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.648648 4907 scope.go:117] "RemoveContainer" containerID="760887aa94cf5066bb78783603b9abf2f7ed6cbd438ed798ca3984f1d9180a14" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.687054 4907 scope.go:117] "RemoveContainer" containerID="c0f9ea1cb6a858298a5179a87d72088806c5152b4153fc9f3bde683e86fe5a68" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.751823 4907 scope.go:117] "RemoveContainer" containerID="3d2497550726bb028e816d349d5575927794a70e34ab605ac02a5b32dac95693" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.788728 4907 scope.go:117] "RemoveContainer" containerID="16914d10778ef5739d4802a16fedae69c8c022532bc0f32a8b69e2fc376e7d85" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.821108 4907 scope.go:117] "RemoveContainer" containerID="ced21043db6f923ceeb9bb46fdae38b50a5d69c32520c2b93ac9f7bebfba109f" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.881952 4907 scope.go:117] "RemoveContainer" containerID="3b774ec6b2281dff36595af1c8af3289f76892b0f6fccc9b4775aff5587fe039" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.911043 
4907 scope.go:117] "RemoveContainer" containerID="a7dd8c996028ff73a870a01b36ef0c572634580c194745dfc74f404cef0c720d" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.942816 4907 scope.go:117] "RemoveContainer" containerID="78b7cecd01ea23f7eb702bcc8518136d4a6050ced256d01f68b1eb33bd68e93c" Oct 09 19:56:39 crc kubenswrapper[4907]: I1009 19:56:39.971005 4907 scope.go:117] "RemoveContainer" containerID="3d8ccba5436a16d2b56612459b17edddd88d5c50b2e5ffc3573c9b65ed57bb49" Oct 09 19:56:42 crc kubenswrapper[4907]: I1009 19:56:42.053097 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8wdm6"] Oct 09 19:56:42 crc kubenswrapper[4907]: I1009 19:56:42.071330 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8wdm6"] Oct 09 19:56:42 crc kubenswrapper[4907]: I1009 19:56:42.152450 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:56:42 crc kubenswrapper[4907]: E1009 19:56:42.153298 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:56:43 crc kubenswrapper[4907]: I1009 19:56:43.162253 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="574e169c-edb6-446d-be0e-7075ec99ebb1" path="/var/lib/kubelet/pods/574e169c-edb6-446d-be0e-7075ec99ebb1/volumes" Oct 09 19:56:53 crc kubenswrapper[4907]: I1009 19:56:53.151820 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:56:53 crc kubenswrapper[4907]: E1009 19:56:53.153225 4907 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:56:56 crc kubenswrapper[4907]: I1009 19:56:56.062952 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-8h99z"] Oct 09 19:56:56 crc kubenswrapper[4907]: I1009 19:56:56.079189 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-8h99z"] Oct 09 19:56:57 crc kubenswrapper[4907]: I1009 19:56:57.166275 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba292ca5-579c-4a89-b291-53bd3ef8d744" path="/var/lib/kubelet/pods/ba292ca5-579c-4a89-b291-53bd3ef8d744/volumes" Oct 09 19:57:03 crc kubenswrapper[4907]: I1009 19:57:03.405959 4907 generic.go:334] "Generic (PLEG): container finished" podID="4f6c717a-ca37-4879-babe-36221d9580fa" containerID="38dc5559724783de90161c5cbfd7e0a2d0765457fb3fbffa7dc7e6e7bac52478" exitCode=0 Oct 09 19:57:03 crc kubenswrapper[4907]: I1009 19:57:03.406086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" event={"ID":"4f6c717a-ca37-4879-babe-36221d9580fa","Type":"ContainerDied","Data":"38dc5559724783de90161c5cbfd7e0a2d0765457fb3fbffa7dc7e6e7bac52478"} Oct 09 19:57:04 crc kubenswrapper[4907]: I1009 19:57:04.868487 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:57:04 crc kubenswrapper[4907]: I1009 19:57:04.913890 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62b9l\" (UniqueName: \"kubernetes.io/projected/4f6c717a-ca37-4879-babe-36221d9580fa-kube-api-access-62b9l\") pod \"4f6c717a-ca37-4879-babe-36221d9580fa\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " Oct 09 19:57:04 crc kubenswrapper[4907]: I1009 19:57:04.913948 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-inventory\") pod \"4f6c717a-ca37-4879-babe-36221d9580fa\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " Oct 09 19:57:04 crc kubenswrapper[4907]: I1009 19:57:04.914084 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-ssh-key\") pod \"4f6c717a-ca37-4879-babe-36221d9580fa\" (UID: \"4f6c717a-ca37-4879-babe-36221d9580fa\") " Oct 09 19:57:04 crc kubenswrapper[4907]: I1009 19:57:04.928685 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6c717a-ca37-4879-babe-36221d9580fa-kube-api-access-62b9l" (OuterVolumeSpecName: "kube-api-access-62b9l") pod "4f6c717a-ca37-4879-babe-36221d9580fa" (UID: "4f6c717a-ca37-4879-babe-36221d9580fa"). InnerVolumeSpecName "kube-api-access-62b9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:57:04 crc kubenswrapper[4907]: I1009 19:57:04.940703 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f6c717a-ca37-4879-babe-36221d9580fa" (UID: "4f6c717a-ca37-4879-babe-36221d9580fa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:57:04 crc kubenswrapper[4907]: I1009 19:57:04.968651 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-inventory" (OuterVolumeSpecName: "inventory") pod "4f6c717a-ca37-4879-babe-36221d9580fa" (UID: "4f6c717a-ca37-4879-babe-36221d9580fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.015772 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.015809 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62b9l\" (UniqueName: \"kubernetes.io/projected/4f6c717a-ca37-4879-babe-36221d9580fa-kube-api-access-62b9l\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.015821 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6c717a-ca37-4879-babe-36221d9580fa-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.162354 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:57:05 crc kubenswrapper[4907]: E1009 19:57:05.162697 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.425405 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" event={"ID":"4f6c717a-ca37-4879-babe-36221d9580fa","Type":"ContainerDied","Data":"961cc1472207cbc57e1adc2383d9d0d799d4836d959c1ba6bfb8ae4077175197"} Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.425771 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="961cc1472207cbc57e1adc2383d9d0d799d4836d959c1ba6bfb8ae4077175197" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.425451 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.520371 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2"] Oct 09 19:57:05 crc kubenswrapper[4907]: E1009 19:57:05.520765 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="extract-utilities" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.520781 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="extract-utilities" Oct 09 19:57:05 crc kubenswrapper[4907]: E1009 19:57:05.520809 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6c717a-ca37-4879-babe-36221d9580fa" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.520816 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6c717a-ca37-4879-babe-36221d9580fa" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:05 crc kubenswrapper[4907]: E1009 19:57:05.520827 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="registry-server" Oct 09 
19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.520832 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="registry-server" Oct 09 19:57:05 crc kubenswrapper[4907]: E1009 19:57:05.520840 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="extract-content" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.520846 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="extract-content" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.521045 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6c717a-ca37-4879-babe-36221d9580fa" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.521066 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6be656-e82a-4901-a882-eea679443872" containerName="registry-server" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.521679 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.523834 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.523869 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.524069 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.524445 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.529199 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.529400 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsgkm\" (UniqueName: \"kubernetes.io/projected/05a8f3e8-9742-4c16-a3a1-2695034bf94d-kube-api-access-tsgkm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.529538 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.533201 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2"] Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.631755 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgkm\" (UniqueName: \"kubernetes.io/projected/05a8f3e8-9742-4c16-a3a1-2695034bf94d-kube-api-access-tsgkm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.631823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.631873 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.635896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.635914 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.650089 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsgkm\" (UniqueName: \"kubernetes.io/projected/05a8f3e8-9742-4c16-a3a1-2695034bf94d-kube-api-access-tsgkm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-r42b2\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:05 crc kubenswrapper[4907]: I1009 19:57:05.842198 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:06 crc kubenswrapper[4907]: I1009 19:57:06.382060 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2"] Oct 09 19:57:06 crc kubenswrapper[4907]: I1009 19:57:06.433868 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" event={"ID":"05a8f3e8-9742-4c16-a3a1-2695034bf94d","Type":"ContainerStarted","Data":"4c5926e6691b921c00690a49460137fa0dcaef1acc84d7724c30c41a621395e3"} Oct 09 19:57:07 crc kubenswrapper[4907]: I1009 19:57:07.444347 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" event={"ID":"05a8f3e8-9742-4c16-a3a1-2695034bf94d","Type":"ContainerStarted","Data":"d79629ff4ca70545c3be5cc97af711df1e517581d7320029b82a375962c027de"} Oct 09 19:57:07 crc kubenswrapper[4907]: I1009 19:57:07.474575 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" podStartSLOduration=1.973365847 podStartE2EDuration="2.47455504s" podCreationTimestamp="2025-10-09 19:57:05 +0000 UTC" firstStartedPulling="2025-10-09 19:57:06.390009315 +0000 UTC m=+1711.921976804" lastFinishedPulling="2025-10-09 19:57:06.891198518 +0000 UTC m=+1712.423165997" observedRunningTime="2025-10-09 19:57:07.470016375 +0000 UTC m=+1713.001983874" watchObservedRunningTime="2025-10-09 19:57:07.47455504 +0000 UTC m=+1713.006522529" Oct 09 19:57:12 crc kubenswrapper[4907]: I1009 19:57:12.492200 4907 generic.go:334] "Generic (PLEG): container finished" podID="05a8f3e8-9742-4c16-a3a1-2695034bf94d" containerID="d79629ff4ca70545c3be5cc97af711df1e517581d7320029b82a375962c027de" exitCode=0 Oct 09 19:57:12 crc kubenswrapper[4907]: I1009 19:57:12.492443 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" event={"ID":"05a8f3e8-9742-4c16-a3a1-2695034bf94d","Type":"ContainerDied","Data":"d79629ff4ca70545c3be5cc97af711df1e517581d7320029b82a375962c027de"} Oct 09 19:57:13 crc kubenswrapper[4907]: I1009 19:57:13.908222 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.092427 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-ssh-key\") pod \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.092617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-inventory\") pod \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.092712 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsgkm\" (UniqueName: \"kubernetes.io/projected/05a8f3e8-9742-4c16-a3a1-2695034bf94d-kube-api-access-tsgkm\") pod \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\" (UID: \"05a8f3e8-9742-4c16-a3a1-2695034bf94d\") " Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.112805 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a8f3e8-9742-4c16-a3a1-2695034bf94d-kube-api-access-tsgkm" (OuterVolumeSpecName: "kube-api-access-tsgkm") pod "05a8f3e8-9742-4c16-a3a1-2695034bf94d" (UID: "05a8f3e8-9742-4c16-a3a1-2695034bf94d"). InnerVolumeSpecName "kube-api-access-tsgkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.143128 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "05a8f3e8-9742-4c16-a3a1-2695034bf94d" (UID: "05a8f3e8-9742-4c16-a3a1-2695034bf94d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.149301 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-inventory" (OuterVolumeSpecName: "inventory") pod "05a8f3e8-9742-4c16-a3a1-2695034bf94d" (UID: "05a8f3e8-9742-4c16-a3a1-2695034bf94d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.195277 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.196138 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05a8f3e8-9742-4c16-a3a1-2695034bf94d-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.196192 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsgkm\" (UniqueName: \"kubernetes.io/projected/05a8f3e8-9742-4c16-a3a1-2695034bf94d-kube-api-access-tsgkm\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.513665 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" 
event={"ID":"05a8f3e8-9742-4c16-a3a1-2695034bf94d","Type":"ContainerDied","Data":"4c5926e6691b921c00690a49460137fa0dcaef1acc84d7724c30c41a621395e3"} Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.513713 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c5926e6691b921c00690a49460137fa0dcaef1acc84d7724c30c41a621395e3" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.514029 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-r42b2" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.584111 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4"] Oct 09 19:57:14 crc kubenswrapper[4907]: E1009 19:57:14.584519 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a8f3e8-9742-4c16-a3a1-2695034bf94d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.584531 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a8f3e8-9742-4c16-a3a1-2695034bf94d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.584694 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a8f3e8-9742-4c16-a3a1-2695034bf94d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.585289 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.611098 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.611347 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n26f\" (UniqueName: \"kubernetes.io/projected/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-kube-api-access-2n26f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.611405 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.613576 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.613753 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.613831 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:57:14 
crc kubenswrapper[4907]: I1009 19:57:14.614006 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.626120 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4"] Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.712982 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.713031 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n26f\" (UniqueName: \"kubernetes.io/projected/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-kube-api-access-2n26f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.713059 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.716869 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.718909 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.736190 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n26f\" (UniqueName: \"kubernetes.io/projected/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-kube-api-access-2n26f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pqds4\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:14 crc kubenswrapper[4907]: I1009 19:57:14.931680 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:15 crc kubenswrapper[4907]: I1009 19:57:15.517147 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4"] Oct 09 19:57:16 crc kubenswrapper[4907]: I1009 19:57:16.035709 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6tdng"] Oct 09 19:57:16 crc kubenswrapper[4907]: I1009 19:57:16.046267 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6tdng"] Oct 09 19:57:16 crc kubenswrapper[4907]: I1009 19:57:16.539550 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" event={"ID":"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd","Type":"ContainerStarted","Data":"b5f0b88a1d3ef3001eb1b403bff134a27a01742e1578cfd672b60d9a9a517adc"} Oct 09 19:57:16 crc kubenswrapper[4907]: I1009 19:57:16.539930 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" event={"ID":"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd","Type":"ContainerStarted","Data":"f2eb0004bd0a01deab5b241d6552b1c91e73b371062ec2fcd87f08869e8fa078"} Oct 09 19:57:16 crc kubenswrapper[4907]: I1009 19:57:16.561252 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" podStartSLOduration=1.96727923 podStartE2EDuration="2.56123152s" podCreationTimestamp="2025-10-09 19:57:14 +0000 UTC" firstStartedPulling="2025-10-09 19:57:15.523925442 +0000 UTC m=+1721.055892931" lastFinishedPulling="2025-10-09 19:57:16.117877712 +0000 UTC m=+1721.649845221" observedRunningTime="2025-10-09 19:57:16.556423299 +0000 UTC m=+1722.088390808" watchObservedRunningTime="2025-10-09 19:57:16.56123152 +0000 UTC m=+1722.093199009" Oct 09 19:57:17 crc kubenswrapper[4907]: I1009 19:57:17.032744 4907 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-gp72n"] Oct 09 19:57:17 crc kubenswrapper[4907]: I1009 19:57:17.046164 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-r6m6d"] Oct 09 19:57:17 crc kubenswrapper[4907]: I1009 19:57:17.054881 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-gp72n"] Oct 09 19:57:17 crc kubenswrapper[4907]: I1009 19:57:17.062846 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-r6m6d"] Oct 09 19:57:17 crc kubenswrapper[4907]: I1009 19:57:17.168257 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d623f5-216e-402f-aee1-e66c892b74f6" path="/var/lib/kubelet/pods/07d623f5-216e-402f-aee1-e66c892b74f6/volumes" Oct 09 19:57:17 crc kubenswrapper[4907]: I1009 19:57:17.168910 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd23d06-3e11-4a14-bddf-dd7e4782fd45" path="/var/lib/kubelet/pods/0bd23d06-3e11-4a14-bddf-dd7e4782fd45/volumes" Oct 09 19:57:17 crc kubenswrapper[4907]: I1009 19:57:17.169519 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4dcdac-5087-4f48-ba67-c508cf316b2b" path="/var/lib/kubelet/pods/5c4dcdac-5087-4f48-ba67-c508cf316b2b/volumes" Oct 09 19:57:20 crc kubenswrapper[4907]: I1009 19:57:20.152018 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:57:20 crc kubenswrapper[4907]: E1009 19:57:20.153119 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:57:26 crc kubenswrapper[4907]: I1009 19:57:26.043786 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e625-account-create-44ktn"] Oct 09 19:57:26 crc kubenswrapper[4907]: I1009 19:57:26.058306 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-550c-account-create-m5bhx"] Oct 09 19:57:26 crc kubenswrapper[4907]: I1009 19:57:26.071484 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-550c-account-create-m5bhx"] Oct 09 19:57:26 crc kubenswrapper[4907]: I1009 19:57:26.081773 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e625-account-create-44ktn"] Oct 09 19:57:27 crc kubenswrapper[4907]: I1009 19:57:27.027217 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fc5a-account-create-fhcs9"] Oct 09 19:57:27 crc kubenswrapper[4907]: I1009 19:57:27.036095 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fc5a-account-create-fhcs9"] Oct 09 19:57:27 crc kubenswrapper[4907]: I1009 19:57:27.163441 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26711161-fa32-4166-b646-1958af798b80" path="/var/lib/kubelet/pods/26711161-fa32-4166-b646-1958af798b80/volumes" Oct 09 19:57:27 crc kubenswrapper[4907]: I1009 19:57:27.164202 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d250ac44-116b-4ab0-9ba9-1396ea38a602" path="/var/lib/kubelet/pods/d250ac44-116b-4ab0-9ba9-1396ea38a602/volumes" Oct 09 19:57:27 crc kubenswrapper[4907]: I1009 19:57:27.164773 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f15805-573f-4e5e-9897-9ba5f8f72d28" path="/var/lib/kubelet/pods/d6f15805-573f-4e5e-9897-9ba5f8f72d28/volumes" Oct 09 19:57:32 crc kubenswrapper[4907]: I1009 19:57:32.151295 4907 scope.go:117] "RemoveContainer" 
containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:57:32 crc kubenswrapper[4907]: E1009 19:57:32.152263 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.263213 4907 scope.go:117] "RemoveContainer" containerID="85df2f84592e794839e3cf2e1bbad860966453e22698796a0a2d9acfbd5f24d6" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.316402 4907 scope.go:117] "RemoveContainer" containerID="e5a38ec84468a29c5eb5e65b35dcc6e955e84c40ab4adea53ce7879912bedc70" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.355592 4907 scope.go:117] "RemoveContainer" containerID="b2a247449d1e5f790005794a741e6830839884d221a4363c738eaa97b9dfc968" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.408778 4907 scope.go:117] "RemoveContainer" containerID="2c547b2367d46b709f3d18badcd8c0642243990baca3f07a08d1a87550e4bb08" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.437886 4907 scope.go:117] "RemoveContainer" containerID="d740fb03ad983bd5492d0bdc1489013183df2e1232605cb012a7127005f6087a" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.495651 4907 scope.go:117] "RemoveContainer" containerID="d3b3694f284983ba24a4792518cf413055575367f1043dc82dc875a1e207efd5" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.520929 4907 scope.go:117] "RemoveContainer" containerID="fcca4b1f90b430b4824ddb7322a68927b1229aa6edfb339ba2503f5eb026bc46" Oct 09 19:57:40 crc kubenswrapper[4907]: I1009 19:57:40.544014 4907 scope.go:117] "RemoveContainer" 
containerID="f0d2ffeb2382b39980ab23b8ace7986ea8a2e8c20beba1694867ef48bde67991" Oct 09 19:57:47 crc kubenswrapper[4907]: I1009 19:57:47.152099 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:57:47 crc kubenswrapper[4907]: E1009 19:57:47.153370 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:57:50 crc kubenswrapper[4907]: I1009 19:57:50.041603 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5b5ql"] Oct 09 19:57:50 crc kubenswrapper[4907]: I1009 19:57:50.057256 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5b5ql"] Oct 09 19:57:51 crc kubenswrapper[4907]: I1009 19:57:51.165109 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7241797e-6008-4959-8314-f8100841d03c" path="/var/lib/kubelet/pods/7241797e-6008-4959-8314-f8100841d03c/volumes" Oct 09 19:57:55 crc kubenswrapper[4907]: I1009 19:57:55.951593 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd" containerID="b5f0b88a1d3ef3001eb1b403bff134a27a01742e1578cfd672b60d9a9a517adc" exitCode=0 Oct 09 19:57:55 crc kubenswrapper[4907]: I1009 19:57:55.951806 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" event={"ID":"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd","Type":"ContainerDied","Data":"b5f0b88a1d3ef3001eb1b403bff134a27a01742e1578cfd672b60d9a9a517adc"} Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.412371 
4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.571557 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-ssh-key\") pod \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.571647 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n26f\" (UniqueName: \"kubernetes.io/projected/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-kube-api-access-2n26f\") pod \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.571686 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-inventory\") pod \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\" (UID: \"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd\") " Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.577785 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-kube-api-access-2n26f" (OuterVolumeSpecName: "kube-api-access-2n26f") pod "3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd" (UID: "3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd"). InnerVolumeSpecName "kube-api-access-2n26f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.604226 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd" (UID: "3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.604641 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-inventory" (OuterVolumeSpecName: "inventory") pod "3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd" (UID: "3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.673693 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.673730 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n26f\" (UniqueName: \"kubernetes.io/projected/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-kube-api-access-2n26f\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.673746 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.978805 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" event={"ID":"3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd","Type":"ContainerDied","Data":"f2eb0004bd0a01deab5b241d6552b1c91e73b371062ec2fcd87f08869e8fa078"} Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.979196 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2eb0004bd0a01deab5b241d6552b1c91e73b371062ec2fcd87f08869e8fa078" Oct 09 19:57:57 crc kubenswrapper[4907]: I1009 19:57:57.978931 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pqds4" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.074273 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw"] Oct 09 19:57:58 crc kubenswrapper[4907]: E1009 19:57:58.074735 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.074754 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.074924 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.075581 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.078703 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.079207 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.079364 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.079911 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.097135 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw"] Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.152007 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:57:58 crc kubenswrapper[4907]: E1009 19:57:58.152281 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.183723 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrb62\" (UniqueName: \"kubernetes.io/projected/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-kube-api-access-lrb62\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.184032 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.184243 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.285907 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.285976 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.286122 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrb62\" (UniqueName: \"kubernetes.io/projected/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-kube-api-access-lrb62\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.291904 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.301713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.302322 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrb62\" (UniqueName: \"kubernetes.io/projected/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-kube-api-access-lrb62\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w65sw\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.394022 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.964291 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw"] Oct 09 19:57:58 crc kubenswrapper[4907]: I1009 19:57:58.989340 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" event={"ID":"25257dc2-dcd5-4771-b24d-94e98cd6d8a1","Type":"ContainerStarted","Data":"d5ec4f1556c243dab59110b07e1d95ac1dafc9ced6d7dc9233a8bde25e3200a8"} Oct 09 19:58:00 crc kubenswrapper[4907]: I1009 19:57:59.999842 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" event={"ID":"25257dc2-dcd5-4771-b24d-94e98cd6d8a1","Type":"ContainerStarted","Data":"bda03cb1976e16d461d21769b212c882373b7a83af519da858130001848a7ce7"} Oct 09 19:58:00 crc kubenswrapper[4907]: I1009 19:58:00.023062 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" podStartSLOduration=1.351787799 podStartE2EDuration="2.023025234s" podCreationTimestamp="2025-10-09 19:57:58 +0000 UTC" firstStartedPulling="2025-10-09 19:57:58.974801367 +0000 UTC m=+1764.506768876" lastFinishedPulling="2025-10-09 19:57:59.646038812 +0000 UTC m=+1765.178006311" observedRunningTime="2025-10-09 19:58:00.012366223 +0000 UTC m=+1765.544333732" watchObservedRunningTime="2025-10-09 19:58:00.023025234 +0000 UTC m=+1765.554992773" Oct 09 19:58:13 crc kubenswrapper[4907]: I1009 19:58:13.151872 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:58:13 crc kubenswrapper[4907]: E1009 19:58:13.152737 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:58:14 crc kubenswrapper[4907]: I1009 19:58:14.058178 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-psdhs"] Oct 09 19:58:14 crc kubenswrapper[4907]: I1009 19:58:14.069407 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-psdhs"] Oct 09 19:58:15 crc kubenswrapper[4907]: I1009 19:58:15.168996 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3525ecd6-fd9f-47bd-b83d-7eb303d3032c" path="/var/lib/kubelet/pods/3525ecd6-fd9f-47bd-b83d-7eb303d3032c/volumes" Oct 09 19:58:16 crc kubenswrapper[4907]: I1009 19:58:16.035663 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6v72h"] Oct 09 19:58:16 crc kubenswrapper[4907]: I1009 19:58:16.046122 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6v72h"] Oct 09 19:58:17 crc kubenswrapper[4907]: I1009 19:58:17.167562 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4360d8b6-f761-4e20-acd0-3cb6580dd756" path="/var/lib/kubelet/pods/4360d8b6-f761-4e20-acd0-3cb6580dd756/volumes" Oct 09 19:58:26 crc kubenswrapper[4907]: I1009 19:58:26.151772 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:58:26 crc kubenswrapper[4907]: E1009 19:58:26.152839 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:58:40 crc kubenswrapper[4907]: I1009 19:58:40.152277 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:58:40 crc kubenswrapper[4907]: E1009 19:58:40.153518 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:58:40 crc kubenswrapper[4907]: I1009 19:58:40.694054 4907 scope.go:117] "RemoveContainer" containerID="f847f77253380e5ef8d7eecf30888d4d43e1510e07098e469eebea69a9aa68ec" Oct 09 19:58:40 crc kubenswrapper[4907]: I1009 19:58:40.742668 4907 scope.go:117] "RemoveContainer" containerID="bb9fbdd7e0b4569504bebb45b35f4e77651f747819e5c6d15b631d8f4b689e15" Oct 09 19:58:40 crc kubenswrapper[4907]: I1009 19:58:40.815781 4907 scope.go:117] "RemoveContainer" containerID="6c205bb8dd05ab038799b5a521785fff3a16b781e2a2f9d8aa7e868cba6b1a70" Oct 09 19:58:53 crc kubenswrapper[4907]: I1009 19:58:53.153000 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:58:53 crc kubenswrapper[4907]: E1009 19:58:53.154294 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:58:56 crc kubenswrapper[4907]: I1009 19:58:56.683859 4907 generic.go:334] "Generic (PLEG): container finished" podID="25257dc2-dcd5-4771-b24d-94e98cd6d8a1" containerID="bda03cb1976e16d461d21769b212c882373b7a83af519da858130001848a7ce7" exitCode=2 Oct 09 19:58:56 crc kubenswrapper[4907]: I1009 19:58:56.683973 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" event={"ID":"25257dc2-dcd5-4771-b24d-94e98cd6d8a1","Type":"ContainerDied","Data":"bda03cb1976e16d461d21769b212c882373b7a83af519da858130001848a7ce7"} Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.039382 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vttg4"] Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.045065 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vttg4"] Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.133165 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.159683 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-ssh-key\") pod \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.159726 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrb62\" (UniqueName: \"kubernetes.io/projected/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-kube-api-access-lrb62\") pod \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.159766 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-inventory\") pod \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\" (UID: \"25257dc2-dcd5-4771-b24d-94e98cd6d8a1\") " Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.169732 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-kube-api-access-lrb62" (OuterVolumeSpecName: "kube-api-access-lrb62") pod "25257dc2-dcd5-4771-b24d-94e98cd6d8a1" (UID: "25257dc2-dcd5-4771-b24d-94e98cd6d8a1"). InnerVolumeSpecName "kube-api-access-lrb62". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.188345 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "25257dc2-dcd5-4771-b24d-94e98cd6d8a1" (UID: "25257dc2-dcd5-4771-b24d-94e98cd6d8a1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.200751 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-inventory" (OuterVolumeSpecName: "inventory") pod "25257dc2-dcd5-4771-b24d-94e98cd6d8a1" (UID: "25257dc2-dcd5-4771-b24d-94e98cd6d8a1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.262912 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.262950 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrb62\" (UniqueName: \"kubernetes.io/projected/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-kube-api-access-lrb62\") on node \"crc\" DevicePath \"\"" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.262967 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25257dc2-dcd5-4771-b24d-94e98cd6d8a1-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.715807 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" event={"ID":"25257dc2-dcd5-4771-b24d-94e98cd6d8a1","Type":"ContainerDied","Data":"d5ec4f1556c243dab59110b07e1d95ac1dafc9ced6d7dc9233a8bde25e3200a8"} Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.716166 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5ec4f1556c243dab59110b07e1d95ac1dafc9ced6d7dc9233a8bde25e3200a8" Oct 09 19:58:58 crc kubenswrapper[4907]: I1009 19:58:58.715857 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w65sw" Oct 09 19:58:59 crc kubenswrapper[4907]: I1009 19:58:59.162569 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9565b208-b543-4da1-ba7e-fcf358d55bdb" path="/var/lib/kubelet/pods/9565b208-b543-4da1-ba7e-fcf358d55bdb/volumes" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.029938 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s"] Oct 09 19:59:05 crc kubenswrapper[4907]: E1009 19:59:05.031062 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25257dc2-dcd5-4771-b24d-94e98cd6d8a1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.031082 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="25257dc2-dcd5-4771-b24d-94e98cd6d8a1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.031336 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="25257dc2-dcd5-4771-b24d-94e98cd6d8a1" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.032306 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.034586 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.034601 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.035027 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.043798 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.052923 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s"] Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.096856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqkth\" (UniqueName: \"kubernetes.io/projected/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-kube-api-access-jqkth\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.096965 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.097096 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.159716 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:59:05 crc kubenswrapper[4907]: E1009 19:59:05.159981 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.199047 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.199163 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.199288 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jqkth\" (UniqueName: \"kubernetes.io/projected/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-kube-api-access-jqkth\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.207908 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.219260 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.221340 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqkth\" (UniqueName: \"kubernetes.io/projected/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-kube-api-access-jqkth\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.359711 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.898976 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s"] Oct 09 19:59:05 crc kubenswrapper[4907]: I1009 19:59:05.903442 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 19:59:06 crc kubenswrapper[4907]: I1009 19:59:06.789454 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" event={"ID":"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9","Type":"ContainerStarted","Data":"2afa60db8cd067fca2e3dd1d32b55a8f0e42ac6a68a6572148b375984a380ea6"} Oct 09 19:59:06 crc kubenswrapper[4907]: I1009 19:59:06.789787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" event={"ID":"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9","Type":"ContainerStarted","Data":"e60b3008f7488f520fa645b25a1236d6febd27cf4191c002dab07bd67895cd8d"} Oct 09 19:59:06 crc kubenswrapper[4907]: I1009 19:59:06.821439 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" podStartSLOduration=1.220333456 podStartE2EDuration="1.821418643s" podCreationTimestamp="2025-10-09 19:59:05 +0000 UTC" firstStartedPulling="2025-10-09 19:59:05.903222433 +0000 UTC m=+1831.435189922" lastFinishedPulling="2025-10-09 19:59:06.50430762 +0000 UTC m=+1832.036275109" observedRunningTime="2025-10-09 19:59:06.813789249 +0000 UTC m=+1832.345756738" watchObservedRunningTime="2025-10-09 19:59:06.821418643 +0000 UTC m=+1832.353386132" Oct 09 19:59:16 crc kubenswrapper[4907]: I1009 19:59:16.151262 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 19:59:16 crc 
kubenswrapper[4907]: I1009 19:59:16.878442 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"ec394429780b5e55358aca0ac686bbdd764b46de0ddd59d4a354bb6ae732345d"} Oct 09 19:59:40 crc kubenswrapper[4907]: I1009 19:59:40.965674 4907 scope.go:117] "RemoveContainer" containerID="c9be4375f904a7e34c412ca79227428d7391399f69c83bc8e6f2e678e17f9b78" Oct 09 19:59:53 crc kubenswrapper[4907]: I1009 19:59:53.264788 4907 generic.go:334] "Generic (PLEG): container finished" podID="f92986eb-ccf3-4d54-a1e6-5f4168a4bab9" containerID="2afa60db8cd067fca2e3dd1d32b55a8f0e42ac6a68a6572148b375984a380ea6" exitCode=0 Oct 09 19:59:53 crc kubenswrapper[4907]: I1009 19:59:53.264857 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" event={"ID":"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9","Type":"ContainerDied","Data":"2afa60db8cd067fca2e3dd1d32b55a8f0e42ac6a68a6572148b375984a380ea6"} Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.708617 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.892150 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-inventory\") pod \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.892332 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqkth\" (UniqueName: \"kubernetes.io/projected/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-kube-api-access-jqkth\") pod \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.892498 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-ssh-key\") pod \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\" (UID: \"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9\") " Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.900644 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-kube-api-access-jqkth" (OuterVolumeSpecName: "kube-api-access-jqkth") pod "f92986eb-ccf3-4d54-a1e6-5f4168a4bab9" (UID: "f92986eb-ccf3-4d54-a1e6-5f4168a4bab9"). InnerVolumeSpecName "kube-api-access-jqkth". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.924042 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f92986eb-ccf3-4d54-a1e6-5f4168a4bab9" (UID: "f92986eb-ccf3-4d54-a1e6-5f4168a4bab9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.929107 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-inventory" (OuterVolumeSpecName: "inventory") pod "f92986eb-ccf3-4d54-a1e6-5f4168a4bab9" (UID: "f92986eb-ccf3-4d54-a1e6-5f4168a4bab9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.994648 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.994678 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 19:59:54 crc kubenswrapper[4907]: I1009 19:59:54.994689 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqkth\" (UniqueName: \"kubernetes.io/projected/f92986eb-ccf3-4d54-a1e6-5f4168a4bab9-kube-api-access-jqkth\") on node \"crc\" DevicePath \"\"" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.297628 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" event={"ID":"f92986eb-ccf3-4d54-a1e6-5f4168a4bab9","Type":"ContainerDied","Data":"e60b3008f7488f520fa645b25a1236d6febd27cf4191c002dab07bd67895cd8d"} Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.297670 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e60b3008f7488f520fa645b25a1236d6febd27cf4191c002dab07bd67895cd8d" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.297721 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.373054 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-g4gz2"] Oct 09 19:59:55 crc kubenswrapper[4907]: E1009 19:59:55.373545 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92986eb-ccf3-4d54-a1e6-5f4168a4bab9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.373567 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92986eb-ccf3-4d54-a1e6-5f4168a4bab9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.373833 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92986eb-ccf3-4d54-a1e6-5f4168a4bab9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.374653 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.376447 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.376587 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.376722 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.376737 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.382245 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-g4gz2"] Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.503407 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.503554 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbsf9\" (UniqueName: \"kubernetes.io/projected/68544a9c-d9f1-42c1-8499-83289992b246-kube-api-access-jbsf9\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.503726 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.605778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.605881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.605941 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbsf9\" (UniqueName: \"kubernetes.io/projected/68544a9c-d9f1-42c1-8499-83289992b246-kube-api-access-jbsf9\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.619755 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc 
kubenswrapper[4907]: I1009 19:59:55.620934 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.628747 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbsf9\" (UniqueName: \"kubernetes.io/projected/68544a9c-d9f1-42c1-8499-83289992b246-kube-api-access-jbsf9\") pod \"ssh-known-hosts-edpm-deployment-g4gz2\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:55 crc kubenswrapper[4907]: I1009 19:59:55.706839 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 19:59:56 crc kubenswrapper[4907]: I1009 19:59:56.261588 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-g4gz2"] Oct 09 19:59:56 crc kubenswrapper[4907]: I1009 19:59:56.306710 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" event={"ID":"68544a9c-d9f1-42c1-8499-83289992b246","Type":"ContainerStarted","Data":"c00ae2e12142f4d837a623d79ad71c73cadc1a5b738244e29f0ee25269405323"} Oct 09 19:59:57 crc kubenswrapper[4907]: I1009 19:59:57.318886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" event={"ID":"68544a9c-d9f1-42c1-8499-83289992b246","Type":"ContainerStarted","Data":"4bf349374f90cec530d72fb304b4a5244ab44962c0eb7eee0c521c91c900585c"} Oct 09 19:59:57 crc kubenswrapper[4907]: I1009 19:59:57.337126 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" podStartSLOduration=1.769000414 podStartE2EDuration="2.337108275s" podCreationTimestamp="2025-10-09 19:59:55 +0000 UTC" firstStartedPulling="2025-10-09 19:59:56.274613101 +0000 UTC m=+1881.806580590" lastFinishedPulling="2025-10-09 19:59:56.842720942 +0000 UTC m=+1882.374688451" observedRunningTime="2025-10-09 19:59:57.332911618 +0000 UTC m=+1882.864879117" watchObservedRunningTime="2025-10-09 19:59:57.337108275 +0000 UTC m=+1882.869075784" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.151197 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g"] Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.153342 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.156259 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.156343 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.161365 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g"] Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.292680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jbv\" (UniqueName: \"kubernetes.io/projected/71a4c366-ec27-4e1f-bd48-a68f7b54763b-kube-api-access-j2jbv\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: 
I1009 20:00:00.292768 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a4c366-ec27-4e1f-bd48-a68f7b54763b-secret-volume\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.292811 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a4c366-ec27-4e1f-bd48-a68f7b54763b-config-volume\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.394997 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jbv\" (UniqueName: \"kubernetes.io/projected/71a4c366-ec27-4e1f-bd48-a68f7b54763b-kube-api-access-j2jbv\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.395066 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a4c366-ec27-4e1f-bd48-a68f7b54763b-secret-volume\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.395100 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a4c366-ec27-4e1f-bd48-a68f7b54763b-config-volume\") pod \"collect-profiles-29334000-4ql6g\" (UID: 
\"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.396009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a4c366-ec27-4e1f-bd48-a68f7b54763b-config-volume\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.410155 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a4c366-ec27-4e1f-bd48-a68f7b54763b-secret-volume\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.422246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jbv\" (UniqueName: \"kubernetes.io/projected/71a4c366-ec27-4e1f-bd48-a68f7b54763b-kube-api-access-j2jbv\") pod \"collect-profiles-29334000-4ql6g\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.482503 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:00 crc kubenswrapper[4907]: I1009 20:00:00.937829 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g"] Oct 09 20:00:00 crc kubenswrapper[4907]: W1009 20:00:00.943928 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a4c366_ec27_4e1f_bd48_a68f7b54763b.slice/crio-68e71cbd6baeff5017489acbc004a2aa7ab5a8f607a6d175f6d0be169f72f6d6 WatchSource:0}: Error finding container 68e71cbd6baeff5017489acbc004a2aa7ab5a8f607a6d175f6d0be169f72f6d6: Status 404 returned error can't find the container with id 68e71cbd6baeff5017489acbc004a2aa7ab5a8f607a6d175f6d0be169f72f6d6 Oct 09 20:00:01 crc kubenswrapper[4907]: I1009 20:00:01.356439 4907 generic.go:334] "Generic (PLEG): container finished" podID="71a4c366-ec27-4e1f-bd48-a68f7b54763b" containerID="f9f77226627b3682d3d62351c709c5bc6c5c54893e253b0ddc08c0c57c653341" exitCode=0 Oct 09 20:00:01 crc kubenswrapper[4907]: I1009 20:00:01.356770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" event={"ID":"71a4c366-ec27-4e1f-bd48-a68f7b54763b","Type":"ContainerDied","Data":"f9f77226627b3682d3d62351c709c5bc6c5c54893e253b0ddc08c0c57c653341"} Oct 09 20:00:01 crc kubenswrapper[4907]: I1009 20:00:01.356800 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" event={"ID":"71a4c366-ec27-4e1f-bd48-a68f7b54763b","Type":"ContainerStarted","Data":"68e71cbd6baeff5017489acbc004a2aa7ab5a8f607a6d175f6d0be169f72f6d6"} Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.716300 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.847511 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2jbv\" (UniqueName: \"kubernetes.io/projected/71a4c366-ec27-4e1f-bd48-a68f7b54763b-kube-api-access-j2jbv\") pod \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.847749 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a4c366-ec27-4e1f-bd48-a68f7b54763b-secret-volume\") pod \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.847801 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a4c366-ec27-4e1f-bd48-a68f7b54763b-config-volume\") pod \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\" (UID: \"71a4c366-ec27-4e1f-bd48-a68f7b54763b\") " Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.848732 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a4c366-ec27-4e1f-bd48-a68f7b54763b-config-volume" (OuterVolumeSpecName: "config-volume") pod "71a4c366-ec27-4e1f-bd48-a68f7b54763b" (UID: "71a4c366-ec27-4e1f-bd48-a68f7b54763b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.856703 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a4c366-ec27-4e1f-bd48-a68f7b54763b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "71a4c366-ec27-4e1f-bd48-a68f7b54763b" (UID: "71a4c366-ec27-4e1f-bd48-a68f7b54763b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.857606 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a4c366-ec27-4e1f-bd48-a68f7b54763b-kube-api-access-j2jbv" (OuterVolumeSpecName: "kube-api-access-j2jbv") pod "71a4c366-ec27-4e1f-bd48-a68f7b54763b" (UID: "71a4c366-ec27-4e1f-bd48-a68f7b54763b"). InnerVolumeSpecName "kube-api-access-j2jbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.949797 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2jbv\" (UniqueName: \"kubernetes.io/projected/71a4c366-ec27-4e1f-bd48-a68f7b54763b-kube-api-access-j2jbv\") on node \"crc\" DevicePath \"\"" Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.949838 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a4c366-ec27-4e1f-bd48-a68f7b54763b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 20:00:02 crc kubenswrapper[4907]: I1009 20:00:02.949853 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a4c366-ec27-4e1f-bd48-a68f7b54763b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 20:00:03 crc kubenswrapper[4907]: I1009 20:00:03.376191 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" event={"ID":"71a4c366-ec27-4e1f-bd48-a68f7b54763b","Type":"ContainerDied","Data":"68e71cbd6baeff5017489acbc004a2aa7ab5a8f607a6d175f6d0be169f72f6d6"} Oct 09 20:00:03 crc kubenswrapper[4907]: I1009 20:00:03.376285 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334000-4ql6g" Oct 09 20:00:03 crc kubenswrapper[4907]: I1009 20:00:03.376292 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e71cbd6baeff5017489acbc004a2aa7ab5a8f607a6d175f6d0be169f72f6d6" Oct 09 20:00:04 crc kubenswrapper[4907]: I1009 20:00:04.388576 4907 generic.go:334] "Generic (PLEG): container finished" podID="68544a9c-d9f1-42c1-8499-83289992b246" containerID="4bf349374f90cec530d72fb304b4a5244ab44962c0eb7eee0c521c91c900585c" exitCode=0 Oct 09 20:00:04 crc kubenswrapper[4907]: I1009 20:00:04.388657 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" event={"ID":"68544a9c-d9f1-42c1-8499-83289992b246","Type":"ContainerDied","Data":"4bf349374f90cec530d72fb304b4a5244ab44962c0eb7eee0c521c91c900585c"} Oct 09 20:00:05 crc kubenswrapper[4907]: I1009 20:00:05.887207 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 20:00:05 crc kubenswrapper[4907]: I1009 20:00:05.960280 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-inventory-0\") pod \"68544a9c-d9f1-42c1-8499-83289992b246\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " Oct 09 20:00:05 crc kubenswrapper[4907]: I1009 20:00:05.960348 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-ssh-key-openstack-edpm-ipam\") pod \"68544a9c-d9f1-42c1-8499-83289992b246\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " Oct 09 20:00:05 crc kubenswrapper[4907]: I1009 20:00:05.960799 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbsf9\" (UniqueName: \"kubernetes.io/projected/68544a9c-d9f1-42c1-8499-83289992b246-kube-api-access-jbsf9\") pod \"68544a9c-d9f1-42c1-8499-83289992b246\" (UID: \"68544a9c-d9f1-42c1-8499-83289992b246\") " Oct 09 20:00:05 crc kubenswrapper[4907]: I1009 20:00:05.971724 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68544a9c-d9f1-42c1-8499-83289992b246-kube-api-access-jbsf9" (OuterVolumeSpecName: "kube-api-access-jbsf9") pod "68544a9c-d9f1-42c1-8499-83289992b246" (UID: "68544a9c-d9f1-42c1-8499-83289992b246"). InnerVolumeSpecName "kube-api-access-jbsf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.007269 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "68544a9c-d9f1-42c1-8499-83289992b246" (UID: "68544a9c-d9f1-42c1-8499-83289992b246"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.008003 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "68544a9c-d9f1-42c1-8499-83289992b246" (UID: "68544a9c-d9f1-42c1-8499-83289992b246"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.063335 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbsf9\" (UniqueName: \"kubernetes.io/projected/68544a9c-d9f1-42c1-8499-83289992b246-kube-api-access-jbsf9\") on node \"crc\" DevicePath \"\"" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.063379 4907 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.063392 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/68544a9c-d9f1-42c1-8499-83289992b246-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.412816 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" 
event={"ID":"68544a9c-d9f1-42c1-8499-83289992b246","Type":"ContainerDied","Data":"c00ae2e12142f4d837a623d79ad71c73cadc1a5b738244e29f0ee25269405323"} Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.412880 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00ae2e12142f4d837a623d79ad71c73cadc1a5b738244e29f0ee25269405323" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.412959 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-g4gz2" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.480219 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"] Oct 09 20:00:06 crc kubenswrapper[4907]: E1009 20:00:06.480614 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a4c366-ec27-4e1f-bd48-a68f7b54763b" containerName="collect-profiles" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.480636 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a4c366-ec27-4e1f-bd48-a68f7b54763b" containerName="collect-profiles" Oct 09 20:00:06 crc kubenswrapper[4907]: E1009 20:00:06.480653 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68544a9c-d9f1-42c1-8499-83289992b246" containerName="ssh-known-hosts-edpm-deployment" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.480661 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="68544a9c-d9f1-42c1-8499-83289992b246" containerName="ssh-known-hosts-edpm-deployment" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.480890 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="68544a9c-d9f1-42c1-8499-83289992b246" containerName="ssh-known-hosts-edpm-deployment" Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.480916 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a4c366-ec27-4e1f-bd48-a68f7b54763b" containerName="collect-profiles" 
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.481648 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.483521 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.483996 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.486980 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.487261 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.490836 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"]
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.571435 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.571552 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmmsc\" (UniqueName: \"kubernetes.io/projected/11f08769-69d8-4b65-8684-c132bd006797-kube-api-access-gmmsc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.571650 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.674024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.674447 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmmsc\" (UniqueName: \"kubernetes.io/projected/11f08769-69d8-4b65-8684-c132bd006797-kube-api-access-gmmsc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.674763 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.679557 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.679562 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.702096 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmmsc\" (UniqueName: \"kubernetes.io/projected/11f08769-69d8-4b65-8684-c132bd006797-kube-api-access-gmmsc\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7b6rw\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:06 crc kubenswrapper[4907]: I1009 20:00:06.809081 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:07 crc kubenswrapper[4907]: I1009 20:00:07.375979 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"]
Oct 09 20:00:07 crc kubenswrapper[4907]: I1009 20:00:07.424206 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw" event={"ID":"11f08769-69d8-4b65-8684-c132bd006797","Type":"ContainerStarted","Data":"5dc0bc3273bb1d30f5b1a3ec7a64ff9efbd6a31ba200c965fe79de0f5660f646"}
Oct 09 20:00:08 crc kubenswrapper[4907]: I1009 20:00:08.439181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw" event={"ID":"11f08769-69d8-4b65-8684-c132bd006797","Type":"ContainerStarted","Data":"14e2a330b30c19055c6048dc70d16f345f7d92bb87ac064ff504320e26ffa692"}
Oct 09 20:00:08 crc kubenswrapper[4907]: I1009 20:00:08.458662 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw" podStartSLOduration=1.7099015 podStartE2EDuration="2.458632319s" podCreationTimestamp="2025-10-09 20:00:06 +0000 UTC" firstStartedPulling="2025-10-09 20:00:07.382912151 +0000 UTC m=+1892.914879650" lastFinishedPulling="2025-10-09 20:00:08.13164294 +0000 UTC m=+1893.663610469" observedRunningTime="2025-10-09 20:00:08.455080239 +0000 UTC m=+1893.987047738" watchObservedRunningTime="2025-10-09 20:00:08.458632319 +0000 UTC m=+1893.990599798"
Oct 09 20:00:17 crc kubenswrapper[4907]: I1009 20:00:17.534992 4907 generic.go:334] "Generic (PLEG): container finished" podID="11f08769-69d8-4b65-8684-c132bd006797" containerID="14e2a330b30c19055c6048dc70d16f345f7d92bb87ac064ff504320e26ffa692" exitCode=0
Oct 09 20:00:17 crc kubenswrapper[4907]: I1009 20:00:17.535044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw" event={"ID":"11f08769-69d8-4b65-8684-c132bd006797","Type":"ContainerDied","Data":"14e2a330b30c19055c6048dc70d16f345f7d92bb87ac064ff504320e26ffa692"}
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.074326 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.228801 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-ssh-key\") pod \"11f08769-69d8-4b65-8684-c132bd006797\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") "
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.229050 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmmsc\" (UniqueName: \"kubernetes.io/projected/11f08769-69d8-4b65-8684-c132bd006797-kube-api-access-gmmsc\") pod \"11f08769-69d8-4b65-8684-c132bd006797\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") "
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.229175 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-inventory\") pod \"11f08769-69d8-4b65-8684-c132bd006797\" (UID: \"11f08769-69d8-4b65-8684-c132bd006797\") "
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.234746 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f08769-69d8-4b65-8684-c132bd006797-kube-api-access-gmmsc" (OuterVolumeSpecName: "kube-api-access-gmmsc") pod "11f08769-69d8-4b65-8684-c132bd006797" (UID: "11f08769-69d8-4b65-8684-c132bd006797"). InnerVolumeSpecName "kube-api-access-gmmsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.256273 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11f08769-69d8-4b65-8684-c132bd006797" (UID: "11f08769-69d8-4b65-8684-c132bd006797"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.272454 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-inventory" (OuterVolumeSpecName: "inventory") pod "11f08769-69d8-4b65-8684-c132bd006797" (UID: "11f08769-69d8-4b65-8684-c132bd006797"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.332538 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.332578 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11f08769-69d8-4b65-8684-c132bd006797-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.332591 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmmsc\" (UniqueName: \"kubernetes.io/projected/11f08769-69d8-4b65-8684-c132bd006797-kube-api-access-gmmsc\") on node \"crc\" DevicePath \"\""
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.567858 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw" event={"ID":"11f08769-69d8-4b65-8684-c132bd006797","Type":"ContainerDied","Data":"5dc0bc3273bb1d30f5b1a3ec7a64ff9efbd6a31ba200c965fe79de0f5660f646"}
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.568188 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dc0bc3273bb1d30f5b1a3ec7a64ff9efbd6a31ba200c965fe79de0f5660f646"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.567977 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7b6rw"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.676305 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"]
Oct 09 20:00:19 crc kubenswrapper[4907]: E1009 20:00:19.676978 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f08769-69d8-4b65-8684-c132bd006797" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.677049 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f08769-69d8-4b65-8684-c132bd006797" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.677507 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f08769-69d8-4b65-8684-c132bd006797" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.678372 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.681980 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.682108 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.682350 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.682666 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.694847 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"]
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.840897 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.841162 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.841452 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxtr2\" (UniqueName: \"kubernetes.io/projected/919363ff-2e8b-4837-828e-b5d15a180260-kube-api-access-qxtr2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.943184 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxtr2\" (UniqueName: \"kubernetes.io/projected/919363ff-2e8b-4837-828e-b5d15a180260-kube-api-access-qxtr2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.943481 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.943565 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.949513 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.951840 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.972832 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxtr2\" (UniqueName: \"kubernetes.io/projected/919363ff-2e8b-4837-828e-b5d15a180260-kube-api-access-qxtr2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:19 crc kubenswrapper[4907]: I1009 20:00:19.997711 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:20 crc kubenswrapper[4907]: I1009 20:00:20.612028 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"]
Oct 09 20:00:21 crc kubenswrapper[4907]: I1009 20:00:21.599625 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx" event={"ID":"919363ff-2e8b-4837-828e-b5d15a180260","Type":"ContainerStarted","Data":"2355cedcbc2da72460272e763f241766014838677536ce37d58d3778b70c7e24"}
Oct 09 20:00:22 crc kubenswrapper[4907]: I1009 20:00:22.617622 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx" event={"ID":"919363ff-2e8b-4837-828e-b5d15a180260","Type":"ContainerStarted","Data":"1acd13c6952377c38883dd6a1f42b1ac11a6c660b4ef88997ec670f8de146382"}
Oct 09 20:00:22 crc kubenswrapper[4907]: I1009 20:00:22.645921 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx" podStartSLOduration=2.254950759 podStartE2EDuration="3.645889049s" podCreationTimestamp="2025-10-09 20:00:19 +0000 UTC" firstStartedPulling="2025-10-09 20:00:20.622789085 +0000 UTC m=+1906.154756574" lastFinishedPulling="2025-10-09 20:00:22.013727365 +0000 UTC m=+1907.545694864" observedRunningTime="2025-10-09 20:00:22.635619829 +0000 UTC m=+1908.167587358" watchObservedRunningTime="2025-10-09 20:00:22.645889049 +0000 UTC m=+1908.177856578"
Oct 09 20:00:32 crc kubenswrapper[4907]: I1009 20:00:32.725857 4907 generic.go:334] "Generic (PLEG): container finished" podID="919363ff-2e8b-4837-828e-b5d15a180260" containerID="1acd13c6952377c38883dd6a1f42b1ac11a6c660b4ef88997ec670f8de146382" exitCode=0
Oct 09 20:00:32 crc kubenswrapper[4907]: I1009 20:00:32.725952 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx" event={"ID":"919363ff-2e8b-4837-828e-b5d15a180260","Type":"ContainerDied","Data":"1acd13c6952377c38883dd6a1f42b1ac11a6c660b4ef88997ec670f8de146382"}
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.134248 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.263875 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxtr2\" (UniqueName: \"kubernetes.io/projected/919363ff-2e8b-4837-828e-b5d15a180260-kube-api-access-qxtr2\") pod \"919363ff-2e8b-4837-828e-b5d15a180260\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") "
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.264035 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-ssh-key\") pod \"919363ff-2e8b-4837-828e-b5d15a180260\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") "
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.264086 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-inventory\") pod \"919363ff-2e8b-4837-828e-b5d15a180260\" (UID: \"919363ff-2e8b-4837-828e-b5d15a180260\") "
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.311272 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919363ff-2e8b-4837-828e-b5d15a180260-kube-api-access-qxtr2" (OuterVolumeSpecName: "kube-api-access-qxtr2") pod "919363ff-2e8b-4837-828e-b5d15a180260" (UID: "919363ff-2e8b-4837-828e-b5d15a180260"). InnerVolumeSpecName "kube-api-access-qxtr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.317074 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-inventory" (OuterVolumeSpecName: "inventory") pod "919363ff-2e8b-4837-828e-b5d15a180260" (UID: "919363ff-2e8b-4837-828e-b5d15a180260"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.336626 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "919363ff-2e8b-4837-828e-b5d15a180260" (UID: "919363ff-2e8b-4837-828e-b5d15a180260"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.368362 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.368402 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxtr2\" (UniqueName: \"kubernetes.io/projected/919363ff-2e8b-4837-828e-b5d15a180260-kube-api-access-qxtr2\") on node \"crc\" DevicePath \"\""
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.368419 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/919363ff-2e8b-4837-828e-b5d15a180260-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.752456 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx" event={"ID":"919363ff-2e8b-4837-828e-b5d15a180260","Type":"ContainerDied","Data":"2355cedcbc2da72460272e763f241766014838677536ce37d58d3778b70c7e24"}
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.752998 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2355cedcbc2da72460272e763f241766014838677536ce37d58d3778b70c7e24"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.753066 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.846458 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"]
Oct 09 20:00:34 crc kubenswrapper[4907]: E1009 20:00:34.847125 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919363ff-2e8b-4837-828e-b5d15a180260" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.847155 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="919363ff-2e8b-4837-828e-b5d15a180260" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.847522 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="919363ff-2e8b-4837-828e-b5d15a180260" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.848535 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.852565 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.853554 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.853644 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.853970 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.854761 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.855045 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.855196 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.855289 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.870762 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"]
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878138 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878196 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878254 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878298 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878351 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878383 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878404 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878432 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878503 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878542 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878570 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878602 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878626 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cvw9\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-kube-api-access-6cvw9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.878656 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980333 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980416 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980451 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980531 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980567 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980617 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980689 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980721 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"
Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980748 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cvw9\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-kube-api-access-6cvw9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") "
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980781 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980844 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.980872 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.985119 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.987237 
4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.987325 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.991384 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.991835 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.992092 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.992389 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.993026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.993258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.993452 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.993626 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.993697 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:34 crc kubenswrapper[4907]: I1009 20:00:34.995094 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:35 crc kubenswrapper[4907]: I1009 20:00:35.005555 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cvw9\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-kube-api-access-6cvw9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-rf68v\" (UID: 
\"77c2e1e4-f07b-4a72-b68d-661856abd621\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:35 crc kubenswrapper[4907]: I1009 20:00:35.185534 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 20:00:35 crc kubenswrapper[4907]: I1009 20:00:35.190938 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:00:35 crc kubenswrapper[4907]: I1009 20:00:35.796870 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v"] Oct 09 20:00:36 crc kubenswrapper[4907]: I1009 20:00:36.312229 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 20:00:36 crc kubenswrapper[4907]: I1009 20:00:36.776198 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" event={"ID":"77c2e1e4-f07b-4a72-b68d-661856abd621","Type":"ContainerStarted","Data":"60fc9e5f601894577df2133fdcf04dad2ff834b077d629f935da677e86d6671f"} Oct 09 20:00:36 crc kubenswrapper[4907]: I1009 20:00:36.776489 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" event={"ID":"77c2e1e4-f07b-4a72-b68d-661856abd621","Type":"ContainerStarted","Data":"edc8e9211ecb9feb7c1adb771dc5236bae8592c0d7612bf7c903843d1f62dbf9"} Oct 09 20:00:36 crc kubenswrapper[4907]: I1009 20:00:36.804616 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" podStartSLOduration=2.283845447 podStartE2EDuration="2.804589628s" podCreationTimestamp="2025-10-09 20:00:34 +0000 UTC" firstStartedPulling="2025-10-09 20:00:35.789272861 +0000 UTC m=+1921.321240360" lastFinishedPulling="2025-10-09 
20:00:36.310017052 +0000 UTC m=+1921.841984541" observedRunningTime="2025-10-09 20:00:36.797323314 +0000 UTC m=+1922.329290863" watchObservedRunningTime="2025-10-09 20:00:36.804589628 +0000 UTC m=+1922.336557157" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.146741 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29334001-d7wk2"] Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.148409 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.178140 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334001-d7wk2"] Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.308390 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-fernet-keys\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.308802 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-config-data\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.309229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-combined-ca-bundle\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.309347 
4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w55xf\" (UniqueName: \"kubernetes.io/projected/e44476e3-d4b7-4c73-a478-9860df9f1d22-kube-api-access-w55xf\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.411150 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-combined-ca-bundle\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.411211 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w55xf\" (UniqueName: \"kubernetes.io/projected/e44476e3-d4b7-4c73-a478-9860df9f1d22-kube-api-access-w55xf\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.411248 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-fernet-keys\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.411288 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-config-data\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.417754 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-config-data\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.418563 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-fernet-keys\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.428684 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-combined-ca-bundle\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.433093 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w55xf\" (UniqueName: \"kubernetes.io/projected/e44476e3-d4b7-4c73-a478-9860df9f1d22-kube-api-access-w55xf\") pod \"keystone-cron-29334001-d7wk2\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.475260 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:00 crc kubenswrapper[4907]: I1009 20:01:00.923114 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334001-d7wk2"] Oct 09 20:01:01 crc kubenswrapper[4907]: I1009 20:01:01.035613 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334001-d7wk2" event={"ID":"e44476e3-d4b7-4c73-a478-9860df9f1d22","Type":"ContainerStarted","Data":"c52089e6ba43867cdaaf71f2f88b55546cde33c7b47bce0f071e8b7468a6bf6e"} Oct 09 20:01:02 crc kubenswrapper[4907]: I1009 20:01:02.047504 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334001-d7wk2" event={"ID":"e44476e3-d4b7-4c73-a478-9860df9f1d22","Type":"ContainerStarted","Data":"c5febb4d1d8aedeb425089cd07ccf6acc2e91e9eab1dd268aac613977f34a488"} Oct 09 20:01:02 crc kubenswrapper[4907]: I1009 20:01:02.067646 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29334001-d7wk2" podStartSLOduration=2.067628325 podStartE2EDuration="2.067628325s" podCreationTimestamp="2025-10-09 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 20:01:02.061566002 +0000 UTC m=+1947.593533491" watchObservedRunningTime="2025-10-09 20:01:02.067628325 +0000 UTC m=+1947.599595814" Oct 09 20:01:04 crc kubenswrapper[4907]: I1009 20:01:04.073523 4907 generic.go:334] "Generic (PLEG): container finished" podID="e44476e3-d4b7-4c73-a478-9860df9f1d22" containerID="c5febb4d1d8aedeb425089cd07ccf6acc2e91e9eab1dd268aac613977f34a488" exitCode=0 Oct 09 20:01:04 crc kubenswrapper[4907]: I1009 20:01:04.073677 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334001-d7wk2" 
event={"ID":"e44476e3-d4b7-4c73-a478-9860df9f1d22","Type":"ContainerDied","Data":"c5febb4d1d8aedeb425089cd07ccf6acc2e91e9eab1dd268aac613977f34a488"} Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.430951 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.615907 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-fernet-keys\") pod \"e44476e3-d4b7-4c73-a478-9860df9f1d22\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.616043 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-combined-ca-bundle\") pod \"e44476e3-d4b7-4c73-a478-9860df9f1d22\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.616235 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w55xf\" (UniqueName: \"kubernetes.io/projected/e44476e3-d4b7-4c73-a478-9860df9f1d22-kube-api-access-w55xf\") pod \"e44476e3-d4b7-4c73-a478-9860df9f1d22\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.616291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-config-data\") pod \"e44476e3-d4b7-4c73-a478-9860df9f1d22\" (UID: \"e44476e3-d4b7-4c73-a478-9860df9f1d22\") " Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.623916 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44476e3-d4b7-4c73-a478-9860df9f1d22-kube-api-access-w55xf" 
(OuterVolumeSpecName: "kube-api-access-w55xf") pod "e44476e3-d4b7-4c73-a478-9860df9f1d22" (UID: "e44476e3-d4b7-4c73-a478-9860df9f1d22"). InnerVolumeSpecName "kube-api-access-w55xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.626280 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e44476e3-d4b7-4c73-a478-9860df9f1d22" (UID: "e44476e3-d4b7-4c73-a478-9860df9f1d22"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.670901 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e44476e3-d4b7-4c73-a478-9860df9f1d22" (UID: "e44476e3-d4b7-4c73-a478-9860df9f1d22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.694433 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-config-data" (OuterVolumeSpecName: "config-data") pod "e44476e3-d4b7-4c73-a478-9860df9f1d22" (UID: "e44476e3-d4b7-4c73-a478-9860df9f1d22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.718534 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.718854 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w55xf\" (UniqueName: \"kubernetes.io/projected/e44476e3-d4b7-4c73-a478-9860df9f1d22-kube-api-access-w55xf\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.718866 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:05 crc kubenswrapper[4907]: I1009 20:01:05.718874 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e44476e3-d4b7-4c73-a478-9860df9f1d22-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:06 crc kubenswrapper[4907]: I1009 20:01:06.100094 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334001-d7wk2" event={"ID":"e44476e3-d4b7-4c73-a478-9860df9f1d22","Type":"ContainerDied","Data":"c52089e6ba43867cdaaf71f2f88b55546cde33c7b47bce0f071e8b7468a6bf6e"} Oct 09 20:01:06 crc kubenswrapper[4907]: I1009 20:01:06.100361 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c52089e6ba43867cdaaf71f2f88b55546cde33c7b47bce0f071e8b7468a6bf6e" Oct 09 20:01:06 crc kubenswrapper[4907]: I1009 20:01:06.100589 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334001-d7wk2" Oct 09 20:01:18 crc kubenswrapper[4907]: I1009 20:01:18.222240 4907 generic.go:334] "Generic (PLEG): container finished" podID="77c2e1e4-f07b-4a72-b68d-661856abd621" containerID="60fc9e5f601894577df2133fdcf04dad2ff834b077d629f935da677e86d6671f" exitCode=0 Oct 09 20:01:18 crc kubenswrapper[4907]: I1009 20:01:18.222345 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" event={"ID":"77c2e1e4-f07b-4a72-b68d-661856abd621","Type":"ContainerDied","Data":"60fc9e5f601894577df2133fdcf04dad2ff834b077d629f935da677e86d6671f"} Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.648560 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.831852 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-ovn-default-certs-0\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.831958 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.832087 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-inventory\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: 
\"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.832175 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-libvirt-combined-ca-bundle\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.832377 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-neutron-metadata-combined-ca-bundle\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.832542 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-telemetry-combined-ca-bundle\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.832700 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ovn-combined-ca-bundle\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.832830 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-bootstrap-combined-ca-bundle\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.832925 
4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-repo-setup-combined-ca-bundle\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.833049 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-nova-combined-ca-bundle\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.833156 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.833270 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ssh-key\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.833316 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.833388 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-6cvw9\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-kube-api-access-6cvw9\") pod \"77c2e1e4-f07b-4a72-b68d-661856abd621\" (UID: \"77c2e1e4-f07b-4a72-b68d-661856abd621\") " Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.839529 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.841033 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.841158 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.843139 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.843260 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.843900 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.844459 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.845454 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.847617 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.851644 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-kube-api-access-6cvw9" (OuterVolumeSpecName: "kube-api-access-6cvw9") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "kube-api-access-6cvw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.851652 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.852901 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.877205 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-inventory" (OuterVolumeSpecName: "inventory") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.895129 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "77c2e1e4-f07b-4a72-b68d-661856abd621" (UID: "77c2e1e4-f07b-4a72-b68d-661856abd621"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935176 4907 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935203 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935214 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935223 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935233 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cvw9\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-kube-api-access-6cvw9\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935245 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935254 4907 reconciler_common.go:293] "Volume detached for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77c2e1e4-f07b-4a72-b68d-661856abd621-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935264 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935272 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935281 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935289 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935297 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935306 4907 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:19 crc kubenswrapper[4907]: I1009 20:01:19.935314 4907 
reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77c2e1e4-f07b-4a72-b68d-661856abd621-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.244978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" event={"ID":"77c2e1e4-f07b-4a72-b68d-661856abd621","Type":"ContainerDied","Data":"edc8e9211ecb9feb7c1adb771dc5236bae8592c0d7612bf7c903843d1f62dbf9"} Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.245019 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc8e9211ecb9feb7c1adb771dc5236bae8592c0d7612bf7c903843d1f62dbf9" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.245100 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-rf68v" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.359050 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk"] Oct 09 20:01:20 crc kubenswrapper[4907]: E1009 20:01:20.359511 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e44476e3-d4b7-4c73-a478-9860df9f1d22" containerName="keystone-cron" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.359533 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e44476e3-d4b7-4c73-a478-9860df9f1d22" containerName="keystone-cron" Oct 09 20:01:20 crc kubenswrapper[4907]: E1009 20:01:20.359543 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c2e1e4-f07b-4a72-b68d-661856abd621" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.359551 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c2e1e4-f07b-4a72-b68d-661856abd621" 
containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.359760 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c2e1e4-f07b-4a72-b68d-661856abd621" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.359780 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e44476e3-d4b7-4c73-a478-9860df9f1d22" containerName="keystone-cron" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.360408 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.363379 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.363531 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.363992 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.364319 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.364499 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.372318 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk"] Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.562439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9csj\" (UniqueName: 
\"kubernetes.io/projected/6a6607e9-2440-4d12-8649-28e484f86815-kube-api-access-n9csj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.562544 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.562600 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.562752 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a6607e9-2440-4d12-8649-28e484f86815-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.562928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.664320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.664666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9csj\" (UniqueName: \"kubernetes.io/projected/6a6607e9-2440-4d12-8649-28e484f86815-kube-api-access-n9csj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.664708 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.664740 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.664790 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/6a6607e9-2440-4d12-8649-28e484f86815-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.665699 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a6607e9-2440-4d12-8649-28e484f86815-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.669271 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.669390 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.674173 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.684684 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9csj\" (UniqueName: \"kubernetes.io/projected/6a6607e9-2440-4d12-8649-28e484f86815-kube-api-access-n9csj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nxtwk\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:20 crc kubenswrapper[4907]: I1009 20:01:20.979172 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:01:21 crc kubenswrapper[4907]: I1009 20:01:21.574771 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk"] Oct 09 20:01:22 crc kubenswrapper[4907]: I1009 20:01:22.270220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" event={"ID":"6a6607e9-2440-4d12-8649-28e484f86815","Type":"ContainerStarted","Data":"671ef439144a1a799122307a147972e3fb8f33a5023805d53c31fbf3cbe06837"} Oct 09 20:01:23 crc kubenswrapper[4907]: I1009 20:01:23.292974 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" event={"ID":"6a6607e9-2440-4d12-8649-28e484f86815","Type":"ContainerStarted","Data":"f4bed700074028b31a4bff1f7685dbcc361ab1ab7eca0d54c051f591f14c2b71"} Oct 09 20:01:23 crc kubenswrapper[4907]: I1009 20:01:23.326678 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" podStartSLOduration=2.8417230030000002 podStartE2EDuration="3.326590354s" podCreationTimestamp="2025-10-09 20:01:20 +0000 UTC" firstStartedPulling="2025-10-09 20:01:21.585300463 +0000 UTC m=+1967.117267952" lastFinishedPulling="2025-10-09 20:01:22.070167804 +0000 UTC m=+1967.602135303" observedRunningTime="2025-10-09 20:01:23.312293511 +0000 UTC m=+1968.844261010" 
watchObservedRunningTime="2025-10-09 20:01:23.326590354 +0000 UTC m=+1968.858557893" Oct 09 20:01:36 crc kubenswrapper[4907]: I1009 20:01:36.299434 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:01:36 crc kubenswrapper[4907]: I1009 20:01:36.299994 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:02:06 crc kubenswrapper[4907]: I1009 20:02:06.299515 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:02:06 crc kubenswrapper[4907]: I1009 20:02:06.300086 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:02:27 crc kubenswrapper[4907]: E1009 20:02:27.611963 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a6607e9_2440_4d12_8649_28e484f86815.slice/crio-f4bed700074028b31a4bff1f7685dbcc361ab1ab7eca0d54c051f591f14c2b71.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a6607e9_2440_4d12_8649_28e484f86815.slice/crio-conmon-f4bed700074028b31a4bff1f7685dbcc361ab1ab7eca0d54c051f591f14c2b71.scope\": RecentStats: unable to find data in memory cache]" Oct 09 20:02:28 crc kubenswrapper[4907]: I1009 20:02:28.026789 4907 generic.go:334] "Generic (PLEG): container finished" podID="6a6607e9-2440-4d12-8649-28e484f86815" containerID="f4bed700074028b31a4bff1f7685dbcc361ab1ab7eca0d54c051f591f14c2b71" exitCode=0 Oct 09 20:02:28 crc kubenswrapper[4907]: I1009 20:02:28.026851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" event={"ID":"6a6607e9-2440-4d12-8649-28e484f86815","Type":"ContainerDied","Data":"f4bed700074028b31a4bff1f7685dbcc361ab1ab7eca0d54c051f591f14c2b71"} Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.493350 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.640515 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ssh-key\") pod \"6a6607e9-2440-4d12-8649-28e484f86815\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.640953 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a6607e9-2440-4d12-8649-28e484f86815-ovncontroller-config-0\") pod \"6a6607e9-2440-4d12-8649-28e484f86815\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.641029 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9csj\" (UniqueName: 
\"kubernetes.io/projected/6a6607e9-2440-4d12-8649-28e484f86815-kube-api-access-n9csj\") pod \"6a6607e9-2440-4d12-8649-28e484f86815\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.641100 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ovn-combined-ca-bundle\") pod \"6a6607e9-2440-4d12-8649-28e484f86815\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.641419 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-inventory\") pod \"6a6607e9-2440-4d12-8649-28e484f86815\" (UID: \"6a6607e9-2440-4d12-8649-28e484f86815\") " Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.649398 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6a6607e9-2440-4d12-8649-28e484f86815" (UID: "6a6607e9-2440-4d12-8649-28e484f86815"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.650450 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6607e9-2440-4d12-8649-28e484f86815-kube-api-access-n9csj" (OuterVolumeSpecName: "kube-api-access-n9csj") pod "6a6607e9-2440-4d12-8649-28e484f86815" (UID: "6a6607e9-2440-4d12-8649-28e484f86815"). InnerVolumeSpecName "kube-api-access-n9csj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.685317 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6a6607e9-2440-4d12-8649-28e484f86815" (UID: "6a6607e9-2440-4d12-8649-28e484f86815"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.700447 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-inventory" (OuterVolumeSpecName: "inventory") pod "6a6607e9-2440-4d12-8649-28e484f86815" (UID: "6a6607e9-2440-4d12-8649-28e484f86815"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.700626 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a6607e9-2440-4d12-8649-28e484f86815-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6a6607e9-2440-4d12-8649-28e484f86815" (UID: "6a6607e9-2440-4d12-8649-28e484f86815"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.744653 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.744706 4907 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6a6607e9-2440-4d12-8649-28e484f86815-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.744731 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9csj\" (UniqueName: \"kubernetes.io/projected/6a6607e9-2440-4d12-8649-28e484f86815-kube-api-access-n9csj\") on node \"crc\" DevicePath \"\"" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.744751 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:02:29 crc kubenswrapper[4907]: I1009 20:02:29.744774 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a6607e9-2440-4d12-8649-28e484f86815-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.051551 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" event={"ID":"6a6607e9-2440-4d12-8649-28e484f86815","Type":"ContainerDied","Data":"671ef439144a1a799122307a147972e3fb8f33a5023805d53c31fbf3cbe06837"} Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.051599 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671ef439144a1a799122307a147972e3fb8f33a5023805d53c31fbf3cbe06837" Oct 09 20:02:30 crc kubenswrapper[4907]: 
I1009 20:02:30.051676 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nxtwk" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.178837 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp"] Oct 09 20:02:30 crc kubenswrapper[4907]: E1009 20:02:30.179375 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6607e9-2440-4d12-8649-28e484f86815" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.179396 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6607e9-2440-4d12-8649-28e484f86815" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.179787 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6607e9-2440-4d12-8649-28e484f86815" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.180707 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.183507 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.184563 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.190236 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.190516 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.190686 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.191854 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.195425 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp"] Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.360885 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.360963 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d6jx\" (UniqueName: \"kubernetes.io/projected/1830993a-457e-4730-a805-fa14152f2824-kube-api-access-8d6jx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.361063 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.361131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.361212 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.361243 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.462683 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.462737 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.462857 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.462887 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d6jx\" (UniqueName: 
\"kubernetes.io/projected/1830993a-457e-4730-a805-fa14152f2824-kube-api-access-8d6jx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.462964 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.462997 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.466225 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.466778 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.467659 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.467826 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.469049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.488030 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d6jx\" (UniqueName: \"kubernetes.io/projected/1830993a-457e-4730-a805-fa14152f2824-kube-api-access-8d6jx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:30 crc kubenswrapper[4907]: I1009 20:02:30.516050 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:02:31 crc kubenswrapper[4907]: I1009 20:02:31.104409 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp"] Oct 09 20:02:32 crc kubenswrapper[4907]: I1009 20:02:32.068311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" event={"ID":"1830993a-457e-4730-a805-fa14152f2824","Type":"ContainerStarted","Data":"43cd514f19a672da1bb2804b8aee166f67adea8b455e4e7e6723df9229c22c91"} Oct 09 20:02:32 crc kubenswrapper[4907]: I1009 20:02:32.068629 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" event={"ID":"1830993a-457e-4730-a805-fa14152f2824","Type":"ContainerStarted","Data":"9ad3d67f0d4ae8f9da43264a3f1507265fdce69ab2db9f9438a7ca5132d7991c"} Oct 09 20:02:32 crc kubenswrapper[4907]: I1009 20:02:32.090320 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" podStartSLOduration=1.689130897 podStartE2EDuration="2.090301326s" podCreationTimestamp="2025-10-09 20:02:30 +0000 UTC" firstStartedPulling="2025-10-09 20:02:31.130399113 +0000 UTC m=+2036.662366602" lastFinishedPulling="2025-10-09 20:02:31.531569542 +0000 UTC m=+2037.063537031" observedRunningTime="2025-10-09 20:02:32.081524343 +0000 UTC m=+2037.613491852" watchObservedRunningTime="2025-10-09 20:02:32.090301326 +0000 UTC m=+2037.622268815" Oct 09 20:02:36 crc kubenswrapper[4907]: I1009 20:02:36.298816 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:02:36 crc kubenswrapper[4907]: I1009 20:02:36.299141 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:02:36 crc kubenswrapper[4907]: I1009 20:02:36.299182 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 20:02:36 crc kubenswrapper[4907]: I1009 20:02:36.300084 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec394429780b5e55358aca0ac686bbdd764b46de0ddd59d4a354bb6ae732345d"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 20:02:36 crc kubenswrapper[4907]: I1009 20:02:36.300142 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://ec394429780b5e55358aca0ac686bbdd764b46de0ddd59d4a354bb6ae732345d" gracePeriod=600 Oct 09 20:02:37 crc kubenswrapper[4907]: I1009 20:02:37.124615 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="ec394429780b5e55358aca0ac686bbdd764b46de0ddd59d4a354bb6ae732345d" exitCode=0 Oct 09 20:02:37 crc kubenswrapper[4907]: I1009 20:02:37.124691 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"ec394429780b5e55358aca0ac686bbdd764b46de0ddd59d4a354bb6ae732345d"} Oct 09 20:02:37 crc kubenswrapper[4907]: I1009 20:02:37.125241 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"} Oct 09 20:02:37 crc kubenswrapper[4907]: I1009 20:02:37.125275 4907 scope.go:117] "RemoveContainer" containerID="37a2742575c8b7ee19b4b74c3cf76eada8dc61a0b622c51382bcfa44e0dda879" Oct 09 20:03:23 crc kubenswrapper[4907]: I1009 20:03:23.647569 4907 generic.go:334] "Generic (PLEG): container finished" podID="1830993a-457e-4730-a805-fa14152f2824" containerID="43cd514f19a672da1bb2804b8aee166f67adea8b455e4e7e6723df9229c22c91" exitCode=0 Oct 09 20:03:23 crc kubenswrapper[4907]: I1009 20:03:23.648068 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" event={"ID":"1830993a-457e-4730-a805-fa14152f2824","Type":"ContainerDied","Data":"43cd514f19a672da1bb2804b8aee166f67adea8b455e4e7e6723df9229c22c91"} Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.102719 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.197974 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1830993a-457e-4730-a805-fa14152f2824\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.198061 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-metadata-combined-ca-bundle\") pod \"1830993a-457e-4730-a805-fa14152f2824\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.198210 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-nova-metadata-neutron-config-0\") pod \"1830993a-457e-4730-a805-fa14152f2824\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.198265 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d6jx\" (UniqueName: \"kubernetes.io/projected/1830993a-457e-4730-a805-fa14152f2824-kube-api-access-8d6jx\") pod \"1830993a-457e-4730-a805-fa14152f2824\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.198325 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-inventory\") pod \"1830993a-457e-4730-a805-fa14152f2824\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " Oct 
09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.198342 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-ssh-key\") pod \"1830993a-457e-4730-a805-fa14152f2824\" (UID: \"1830993a-457e-4730-a805-fa14152f2824\") " Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.206811 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1830993a-457e-4730-a805-fa14152f2824" (UID: "1830993a-457e-4730-a805-fa14152f2824"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.219462 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1830993a-457e-4730-a805-fa14152f2824-kube-api-access-8d6jx" (OuterVolumeSpecName: "kube-api-access-8d6jx") pod "1830993a-457e-4730-a805-fa14152f2824" (UID: "1830993a-457e-4730-a805-fa14152f2824"). InnerVolumeSpecName "kube-api-access-8d6jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.231415 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1830993a-457e-4730-a805-fa14152f2824" (UID: "1830993a-457e-4730-a805-fa14152f2824"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.236610 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1830993a-457e-4730-a805-fa14152f2824" (UID: "1830993a-457e-4730-a805-fa14152f2824"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.240121 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-inventory" (OuterVolumeSpecName: "inventory") pod "1830993a-457e-4730-a805-fa14152f2824" (UID: "1830993a-457e-4730-a805-fa14152f2824"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.247348 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1830993a-457e-4730-a805-fa14152f2824" (UID: "1830993a-457e-4730-a805-fa14152f2824"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.301858 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.302191 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d6jx\" (UniqueName: \"kubernetes.io/projected/1830993a-457e-4730-a805-fa14152f2824-kube-api-access-8d6jx\") on node \"crc\" DevicePath \"\"" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.302205 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.302214 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.302225 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.302235 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1830993a-457e-4730-a805-fa14152f2824-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.681239 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" 
event={"ID":"1830993a-457e-4730-a805-fa14152f2824","Type":"ContainerDied","Data":"9ad3d67f0d4ae8f9da43264a3f1507265fdce69ab2db9f9438a7ca5132d7991c"} Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.681909 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad3d67f0d4ae8f9da43264a3f1507265fdce69ab2db9f9438a7ca5132d7991c" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.681313 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.835834 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6"] Oct 09 20:03:25 crc kubenswrapper[4907]: E1009 20:03:25.836348 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1830993a-457e-4730-a805-fa14152f2824" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.836371 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1830993a-457e-4730-a805-fa14152f2824" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.836636 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1830993a-457e-4730-a805-fa14152f2824" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.837304 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.839443 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.839618 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.839762 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.839867 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.842867 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 20:03:25 crc kubenswrapper[4907]: I1009 20:03:25.847777 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6"] Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.022995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.023071 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.023105 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2nj4\" (UniqueName: \"kubernetes.io/projected/313461ee-e16e-42e8-97ef-5e2d16f23cb5-kube-api-access-q2nj4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.023554 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.023722 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.125350 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.125434 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.125497 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.125524 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2nj4\" (UniqueName: \"kubernetes.io/projected/313461ee-e16e-42e8-97ef-5e2d16f23cb5-kube-api-access-q2nj4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.125609 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.130599 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.131164 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.131828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.132622 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.142689 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2nj4\" (UniqueName: \"kubernetes.io/projected/313461ee-e16e-42e8-97ef-5e2d16f23cb5-kube-api-access-q2nj4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.155007 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:03:26 crc kubenswrapper[4907]: I1009 20:03:26.699495 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6"] Oct 09 20:03:27 crc kubenswrapper[4907]: I1009 20:03:27.707863 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" event={"ID":"313461ee-e16e-42e8-97ef-5e2d16f23cb5","Type":"ContainerStarted","Data":"45c6313c74d59859db3a71d1660a8327c0a3d7b831d85f3b2cf57b8aa17ce7b9"} Oct 09 20:03:28 crc kubenswrapper[4907]: I1009 20:03:28.725493 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" event={"ID":"313461ee-e16e-42e8-97ef-5e2d16f23cb5","Type":"ContainerStarted","Data":"a53d0533fc7889ce06a23cfb44eb04f20fbfed9704fb67f02308f43bd123bcf8"} Oct 09 20:03:28 crc kubenswrapper[4907]: I1009 20:03:28.761634 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" podStartSLOduration=3.045252974 podStartE2EDuration="3.761613424s" podCreationTimestamp="2025-10-09 20:03:25 +0000 UTC" firstStartedPulling="2025-10-09 20:03:26.700635379 +0000 UTC m=+2092.232602898" lastFinishedPulling="2025-10-09 20:03:27.416995809 +0000 UTC m=+2092.948963348" observedRunningTime="2025-10-09 20:03:28.754135764 +0000 UTC m=+2094.286103323" watchObservedRunningTime="2025-10-09 20:03:28.761613424 +0000 UTC m=+2094.293580943" Oct 09 20:04:36 crc kubenswrapper[4907]: I1009 20:04:36.298995 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:04:36 crc kubenswrapper[4907]: I1009 
20:04:36.299512 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:05:06 crc kubenswrapper[4907]: I1009 20:05:06.299563 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:05:06 crc kubenswrapper[4907]: I1009 20:05:06.300300 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:05:36 crc kubenswrapper[4907]: I1009 20:05:36.299634 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:05:36 crc kubenswrapper[4907]: I1009 20:05:36.300217 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:05:36 crc kubenswrapper[4907]: I1009 20:05:36.300275 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 20:05:36 crc kubenswrapper[4907]: I1009 20:05:36.301240 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 20:05:36 crc kubenswrapper[4907]: I1009 20:05:36.301325 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" gracePeriod=600 Oct 09 20:05:36 crc kubenswrapper[4907]: E1009 20:05:36.432155 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:05:37 crc kubenswrapper[4907]: I1009 20:05:37.069403 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" exitCode=0 Oct 09 20:05:37 crc kubenswrapper[4907]: I1009 20:05:37.069454 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"} Oct 09 20:05:37 crc 
kubenswrapper[4907]: I1009 20:05:37.069515 4907 scope.go:117] "RemoveContainer" containerID="ec394429780b5e55358aca0ac686bbdd764b46de0ddd59d4a354bb6ae732345d" Oct 09 20:05:37 crc kubenswrapper[4907]: I1009 20:05:37.070434 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:05:37 crc kubenswrapper[4907]: E1009 20:05:37.070793 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:05:48 crc kubenswrapper[4907]: I1009 20:05:48.151372 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:05:48 crc kubenswrapper[4907]: E1009 20:05:48.152265 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.385684 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-96q77"] Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.388501 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.401762 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96q77"] Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.453354 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-utilities\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.453710 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-catalog-content\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.453835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bxx\" (UniqueName: \"kubernetes.io/projected/39ea1abe-230c-4359-bef1-837592554d64-kube-api-access-l5bxx\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.555482 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-catalog-content\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.555549 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l5bxx\" (UniqueName: \"kubernetes.io/projected/39ea1abe-230c-4359-bef1-837592554d64-kube-api-access-l5bxx\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.555695 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-utilities\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.556154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-utilities\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.556162 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-catalog-content\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.578155 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bxx\" (UniqueName: \"kubernetes.io/projected/39ea1abe-230c-4359-bef1-837592554d64-kube-api-access-l5bxx\") pod \"certified-operators-96q77\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:51 crc kubenswrapper[4907]: I1009 20:05:51.717386 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:05:52 crc kubenswrapper[4907]: I1009 20:05:52.006276 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96q77"] Oct 09 20:05:52 crc kubenswrapper[4907]: I1009 20:05:52.246906 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96q77" event={"ID":"39ea1abe-230c-4359-bef1-837592554d64","Type":"ContainerStarted","Data":"2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b"} Oct 09 20:05:52 crc kubenswrapper[4907]: I1009 20:05:52.247167 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96q77" event={"ID":"39ea1abe-230c-4359-bef1-837592554d64","Type":"ContainerStarted","Data":"84e40c7e7de8bde909b02f3bdbd39829efa9e27bc38df83ec2f83aef60dc9cba"} Oct 09 20:05:53 crc kubenswrapper[4907]: I1009 20:05:53.259861 4907 generic.go:334] "Generic (PLEG): container finished" podID="39ea1abe-230c-4359-bef1-837592554d64" containerID="2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b" exitCode=0 Oct 09 20:05:53 crc kubenswrapper[4907]: I1009 20:05:53.259959 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96q77" event={"ID":"39ea1abe-230c-4359-bef1-837592554d64","Type":"ContainerDied","Data":"2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b"} Oct 09 20:05:53 crc kubenswrapper[4907]: I1009 20:05:53.263971 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 20:05:55 crc kubenswrapper[4907]: I1009 20:05:55.301522 4907 generic.go:334] "Generic (PLEG): container finished" podID="39ea1abe-230c-4359-bef1-837592554d64" containerID="a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e" exitCode=0 Oct 09 20:05:55 crc kubenswrapper[4907]: I1009 20:05:55.301758 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-96q77" event={"ID":"39ea1abe-230c-4359-bef1-837592554d64","Type":"ContainerDied","Data":"a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e"} Oct 09 20:05:56 crc kubenswrapper[4907]: I1009 20:05:56.317066 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96q77" event={"ID":"39ea1abe-230c-4359-bef1-837592554d64","Type":"ContainerStarted","Data":"4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a"} Oct 09 20:05:56 crc kubenswrapper[4907]: I1009 20:05:56.343672 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-96q77" podStartSLOduration=2.711798239 podStartE2EDuration="5.343649286s" podCreationTimestamp="2025-10-09 20:05:51 +0000 UTC" firstStartedPulling="2025-10-09 20:05:53.263730252 +0000 UTC m=+2238.795697741" lastFinishedPulling="2025-10-09 20:05:55.895581259 +0000 UTC m=+2241.427548788" observedRunningTime="2025-10-09 20:05:56.340808453 +0000 UTC m=+2241.872775962" watchObservedRunningTime="2025-10-09 20:05:56.343649286 +0000 UTC m=+2241.875616775" Oct 09 20:05:59 crc kubenswrapper[4907]: I1009 20:05:59.152190 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:05:59 crc kubenswrapper[4907]: E1009 20:05:59.152824 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:06:01 crc kubenswrapper[4907]: I1009 20:06:01.727381 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:06:01 crc kubenswrapper[4907]: I1009 20:06:01.728045 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:06:01 crc kubenswrapper[4907]: I1009 20:06:01.818299 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:06:02 crc kubenswrapper[4907]: I1009 20:06:02.475563 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:06:02 crc kubenswrapper[4907]: I1009 20:06:02.530305 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-96q77"] Oct 09 20:06:04 crc kubenswrapper[4907]: I1009 20:06:04.433575 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-96q77" podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="registry-server" containerID="cri-o://4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a" gracePeriod=2 Oct 09 20:06:04 crc kubenswrapper[4907]: I1009 20:06:04.920061 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:06:04 crc kubenswrapper[4907]: I1009 20:06:04.980608 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bxx\" (UniqueName: \"kubernetes.io/projected/39ea1abe-230c-4359-bef1-837592554d64-kube-api-access-l5bxx\") pod \"39ea1abe-230c-4359-bef1-837592554d64\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " Oct 09 20:06:04 crc kubenswrapper[4907]: I1009 20:06:04.980852 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-utilities\") pod \"39ea1abe-230c-4359-bef1-837592554d64\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " Oct 09 20:06:04 crc kubenswrapper[4907]: I1009 20:06:04.980958 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-catalog-content\") pod \"39ea1abe-230c-4359-bef1-837592554d64\" (UID: \"39ea1abe-230c-4359-bef1-837592554d64\") " Oct 09 20:06:04 crc kubenswrapper[4907]: I1009 20:06:04.982104 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-utilities" (OuterVolumeSpecName: "utilities") pod "39ea1abe-230c-4359-bef1-837592554d64" (UID: "39ea1abe-230c-4359-bef1-837592554d64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:06:04 crc kubenswrapper[4907]: I1009 20:06:04.987664 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ea1abe-230c-4359-bef1-837592554d64-kube-api-access-l5bxx" (OuterVolumeSpecName: "kube-api-access-l5bxx") pod "39ea1abe-230c-4359-bef1-837592554d64" (UID: "39ea1abe-230c-4359-bef1-837592554d64"). InnerVolumeSpecName "kube-api-access-l5bxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.029694 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39ea1abe-230c-4359-bef1-837592554d64" (UID: "39ea1abe-230c-4359-bef1-837592554d64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.083333 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.083378 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ea1abe-230c-4359-bef1-837592554d64-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.083390 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bxx\" (UniqueName: \"kubernetes.io/projected/39ea1abe-230c-4359-bef1-837592554d64-kube-api-access-l5bxx\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.446520 4907 generic.go:334] "Generic (PLEG): container finished" podID="39ea1abe-230c-4359-bef1-837592554d64" containerID="4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a" exitCode=0 Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.446610 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96q77" event={"ID":"39ea1abe-230c-4359-bef1-837592554d64","Type":"ContainerDied","Data":"4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a"} Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.446668 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-96q77" event={"ID":"39ea1abe-230c-4359-bef1-837592554d64","Type":"ContainerDied","Data":"84e40c7e7de8bde909b02f3bdbd39829efa9e27bc38df83ec2f83aef60dc9cba"} Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.446687 4907 scope.go:117] "RemoveContainer" containerID="4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.447354 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96q77" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.478812 4907 scope.go:117] "RemoveContainer" containerID="a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.494589 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-96q77"] Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.508374 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-96q77"] Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.523888 4907 scope.go:117] "RemoveContainer" containerID="2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.563045 4907 scope.go:117] "RemoveContainer" containerID="4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a" Oct 09 20:06:05 crc kubenswrapper[4907]: E1009 20:06:05.563820 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a\": container with ID starting with 4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a not found: ID does not exist" containerID="4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 
20:06:05.563867 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a"} err="failed to get container status \"4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a\": rpc error: code = NotFound desc = could not find container \"4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a\": container with ID starting with 4cecd5c56027861273d1224500183c90fc47a2d41533a7094793bf92b6b5601a not found: ID does not exist" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.563900 4907 scope.go:117] "RemoveContainer" containerID="a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e" Oct 09 20:06:05 crc kubenswrapper[4907]: E1009 20:06:05.564297 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e\": container with ID starting with a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e not found: ID does not exist" containerID="a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.564321 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e"} err="failed to get container status \"a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e\": rpc error: code = NotFound desc = could not find container \"a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e\": container with ID starting with a38cb0fcf0bea0a8232ec850d733c5615247037136d676a5c4e644e8fc96665e not found: ID does not exist" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.564339 4907 scope.go:117] "RemoveContainer" containerID="2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b" Oct 09 20:06:05 crc 
kubenswrapper[4907]: E1009 20:06:05.564825 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b\": container with ID starting with 2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b not found: ID does not exist" containerID="2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b" Oct 09 20:06:05 crc kubenswrapper[4907]: I1009 20:06:05.564853 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b"} err="failed to get container status \"2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b\": rpc error: code = NotFound desc = could not find container \"2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b\": container with ID starting with 2b7211497074afb7247a7c077f3c86b4c67579638978fc8d19d8e4fac1690c4b not found: ID does not exist" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.171530 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ea1abe-230c-4359-bef1-837592554d64" path="/var/lib/kubelet/pods/39ea1abe-230c-4359-bef1-837592554d64/volumes" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.468240 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r6rww"] Oct 09 20:06:07 crc kubenswrapper[4907]: E1009 20:06:07.468734 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="registry-server" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.468756 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="registry-server" Oct 09 20:06:07 crc kubenswrapper[4907]: E1009 20:06:07.468807 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="extract-utilities" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.468816 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="extract-utilities" Oct 09 20:06:07 crc kubenswrapper[4907]: E1009 20:06:07.468841 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="extract-content" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.468850 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="extract-content" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.469079 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ea1abe-230c-4359-bef1-837592554d64" containerName="registry-server" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.471165 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.481995 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6rww"] Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.626707 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4l4\" (UniqueName: \"kubernetes.io/projected/a1817094-077d-4697-9334-6afb3f51039f-kube-api-access-pc4l4\") pod \"redhat-operators-r6rww\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.626911 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-utilities\") pod \"redhat-operators-r6rww\" (UID: 
\"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.627000 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-catalog-content\") pod \"redhat-operators-r6rww\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.734369 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4l4\" (UniqueName: \"kubernetes.io/projected/a1817094-077d-4697-9334-6afb3f51039f-kube-api-access-pc4l4\") pod \"redhat-operators-r6rww\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.734687 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-utilities\") pod \"redhat-operators-r6rww\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.734836 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-catalog-content\") pod \"redhat-operators-r6rww\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.735873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-catalog-content\") pod \"redhat-operators-r6rww\" (UID: 
\"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.738888 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-utilities\") pod \"redhat-operators-r6rww\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.769240 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4l4\" (UniqueName: \"kubernetes.io/projected/a1817094-077d-4697-9334-6afb3f51039f-kube-api-access-pc4l4\") pod \"redhat-operators-r6rww\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:07 crc kubenswrapper[4907]: I1009 20:06:07.804581 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:08 crc kubenswrapper[4907]: I1009 20:06:08.264522 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6rww"] Oct 09 20:06:08 crc kubenswrapper[4907]: W1009 20:06:08.276494 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1817094_077d_4697_9334_6afb3f51039f.slice/crio-d8208b17c5a5ed2097340ff4153dc6651a1bc1c9d60a74b4b83a1e9a7fc8dc87 WatchSource:0}: Error finding container d8208b17c5a5ed2097340ff4153dc6651a1bc1c9d60a74b4b83a1e9a7fc8dc87: Status 404 returned error can't find the container with id d8208b17c5a5ed2097340ff4153dc6651a1bc1c9d60a74b4b83a1e9a7fc8dc87 Oct 09 20:06:08 crc kubenswrapper[4907]: I1009 20:06:08.480544 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6rww" 
event={"ID":"a1817094-077d-4697-9334-6afb3f51039f","Type":"ContainerStarted","Data":"d8208b17c5a5ed2097340ff4153dc6651a1bc1c9d60a74b4b83a1e9a7fc8dc87"} Oct 09 20:06:09 crc kubenswrapper[4907]: I1009 20:06:09.492040 4907 generic.go:334] "Generic (PLEG): container finished" podID="a1817094-077d-4697-9334-6afb3f51039f" containerID="855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f" exitCode=0 Oct 09 20:06:09 crc kubenswrapper[4907]: I1009 20:06:09.492094 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6rww" event={"ID":"a1817094-077d-4697-9334-6afb3f51039f","Type":"ContainerDied","Data":"855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f"} Oct 09 20:06:10 crc kubenswrapper[4907]: I1009 20:06:10.505163 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6rww" event={"ID":"a1817094-077d-4697-9334-6afb3f51039f","Type":"ContainerStarted","Data":"e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a"} Oct 09 20:06:11 crc kubenswrapper[4907]: I1009 20:06:11.152271 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:06:11 crc kubenswrapper[4907]: E1009 20:06:11.152587 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:06:11 crc kubenswrapper[4907]: I1009 20:06:11.519766 4907 generic.go:334] "Generic (PLEG): container finished" podID="a1817094-077d-4697-9334-6afb3f51039f" containerID="e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a" exitCode=0 Oct 09 20:06:11 crc 
kubenswrapper[4907]: I1009 20:06:11.519835 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6rww" event={"ID":"a1817094-077d-4697-9334-6afb3f51039f","Type":"ContainerDied","Data":"e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a"} Oct 09 20:06:12 crc kubenswrapper[4907]: I1009 20:06:12.585387 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r6rww" podStartSLOduration=2.7569053610000003 podStartE2EDuration="5.585371362s" podCreationTimestamp="2025-10-09 20:06:07 +0000 UTC" firstStartedPulling="2025-10-09 20:06:09.494388514 +0000 UTC m=+2255.026356023" lastFinishedPulling="2025-10-09 20:06:12.322854505 +0000 UTC m=+2257.854822024" observedRunningTime="2025-10-09 20:06:12.583435819 +0000 UTC m=+2258.115403318" watchObservedRunningTime="2025-10-09 20:06:12.585371362 +0000 UTC m=+2258.117338851" Oct 09 20:06:13 crc kubenswrapper[4907]: I1009 20:06:13.575286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6rww" event={"ID":"a1817094-077d-4697-9334-6afb3f51039f","Type":"ContainerStarted","Data":"ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c"} Oct 09 20:06:17 crc kubenswrapper[4907]: I1009 20:06:17.806221 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:17 crc kubenswrapper[4907]: I1009 20:06:17.806874 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:17 crc kubenswrapper[4907]: I1009 20:06:17.860427 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:18 crc kubenswrapper[4907]: I1009 20:06:18.679770 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:18 crc kubenswrapper[4907]: I1009 20:06:18.747860 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6rww"] Oct 09 20:06:20 crc kubenswrapper[4907]: I1009 20:06:20.641443 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r6rww" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="registry-server" containerID="cri-o://ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c" gracePeriod=2 Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.104255 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.208885 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-utilities\") pod \"a1817094-077d-4697-9334-6afb3f51039f\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.208947 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-catalog-content\") pod \"a1817094-077d-4697-9334-6afb3f51039f\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.209092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc4l4\" (UniqueName: \"kubernetes.io/projected/a1817094-077d-4697-9334-6afb3f51039f-kube-api-access-pc4l4\") pod \"a1817094-077d-4697-9334-6afb3f51039f\" (UID: \"a1817094-077d-4697-9334-6afb3f51039f\") " Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.210231 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-utilities" (OuterVolumeSpecName: "utilities") pod "a1817094-077d-4697-9334-6afb3f51039f" (UID: "a1817094-077d-4697-9334-6afb3f51039f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.217279 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1817094-077d-4697-9334-6afb3f51039f-kube-api-access-pc4l4" (OuterVolumeSpecName: "kube-api-access-pc4l4") pod "a1817094-077d-4697-9334-6afb3f51039f" (UID: "a1817094-077d-4697-9334-6afb3f51039f"). InnerVolumeSpecName "kube-api-access-pc4l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.311282 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.311317 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc4l4\" (UniqueName: \"kubernetes.io/projected/a1817094-077d-4697-9334-6afb3f51039f-kube-api-access-pc4l4\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.657649 4907 generic.go:334] "Generic (PLEG): container finished" podID="a1817094-077d-4697-9334-6afb3f51039f" containerID="ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c" exitCode=0 Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.657774 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6rww" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.657786 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6rww" event={"ID":"a1817094-077d-4697-9334-6afb3f51039f","Type":"ContainerDied","Data":"ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c"} Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.658128 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6rww" event={"ID":"a1817094-077d-4697-9334-6afb3f51039f","Type":"ContainerDied","Data":"d8208b17c5a5ed2097340ff4153dc6651a1bc1c9d60a74b4b83a1e9a7fc8dc87"} Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.658162 4907 scope.go:117] "RemoveContainer" containerID="ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.693479 4907 scope.go:117] "RemoveContainer" containerID="e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.730532 4907 scope.go:117] "RemoveContainer" containerID="855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.789969 4907 scope.go:117] "RemoveContainer" containerID="ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c" Oct 09 20:06:21 crc kubenswrapper[4907]: E1009 20:06:21.790442 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c\": container with ID starting with ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c not found: ID does not exist" containerID="ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.790523 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c"} err="failed to get container status \"ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c\": rpc error: code = NotFound desc = could not find container \"ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c\": container with ID starting with ebcd6c79a68c5702df95f1c184c417fea8e926c0357f5ebb2b2b7261a26db57c not found: ID does not exist" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.790553 4907 scope.go:117] "RemoveContainer" containerID="e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a" Oct 09 20:06:21 crc kubenswrapper[4907]: E1009 20:06:21.791221 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a\": container with ID starting with e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a not found: ID does not exist" containerID="e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.791270 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a"} err="failed to get container status \"e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a\": rpc error: code = NotFound desc = could not find container \"e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a\": container with ID starting with e84e3a4a181c03f4803913e3fea96f6cf5a3a2538b27c20b50b915d3fa188f9a not found: ID does not exist" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.791295 4907 scope.go:117] "RemoveContainer" containerID="855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f" Oct 09 20:06:21 crc kubenswrapper[4907]: E1009 20:06:21.791791 4907 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f\": container with ID starting with 855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f not found: ID does not exist" containerID="855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f" Oct 09 20:06:21 crc kubenswrapper[4907]: I1009 20:06:21.791836 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f"} err="failed to get container status \"855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f\": rpc error: code = NotFound desc = could not find container \"855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f\": container with ID starting with 855c4b1dc06c7351b1afb12a11f25176da74f9868012472356b6fb6ee127035f not found: ID does not exist" Oct 09 20:06:22 crc kubenswrapper[4907]: I1009 20:06:22.152062 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:06:22 crc kubenswrapper[4907]: E1009 20:06:22.152590 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:06:22 crc kubenswrapper[4907]: I1009 20:06:22.615954 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1817094-077d-4697-9334-6afb3f51039f" (UID: 
"a1817094-077d-4697-9334-6afb3f51039f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:06:22 crc kubenswrapper[4907]: I1009 20:06:22.637546 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1817094-077d-4697-9334-6afb3f51039f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:22 crc kubenswrapper[4907]: I1009 20:06:22.895346 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6rww"] Oct 09 20:06:22 crc kubenswrapper[4907]: I1009 20:06:22.903400 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r6rww"] Oct 09 20:06:23 crc kubenswrapper[4907]: I1009 20:06:23.167206 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1817094-077d-4697-9334-6afb3f51039f" path="/var/lib/kubelet/pods/a1817094-077d-4697-9334-6afb3f51039f/volumes" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.119382 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cfdr8"] Oct 09 20:06:34 crc kubenswrapper[4907]: E1009 20:06:34.120160 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="extract-utilities" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.120171 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="extract-utilities" Oct 09 20:06:34 crc kubenswrapper[4907]: E1009 20:06:34.120204 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="extract-content" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.120212 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="extract-content" Oct 09 20:06:34 crc 
kubenswrapper[4907]: E1009 20:06:34.120223 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="registry-server" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.120229 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="registry-server" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.120401 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1817094-077d-4697-9334-6afb3f51039f" containerName="registry-server" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.122124 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.134045 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cfdr8"] Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.257728 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-catalog-content\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.257783 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqlsr\" (UniqueName: \"kubernetes.io/projected/2f2a9664-e6e1-43ee-9b48-f9c220b90116-kube-api-access-qqlsr\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.257909 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-utilities\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.359309 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-utilities\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.359763 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-catalog-content\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.359785 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqlsr\" (UniqueName: \"kubernetes.io/projected/2f2a9664-e6e1-43ee-9b48-f9c220b90116-kube-api-access-qqlsr\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.359850 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-utilities\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.360169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-catalog-content\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.381779 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqlsr\" (UniqueName: \"kubernetes.io/projected/2f2a9664-e6e1-43ee-9b48-f9c220b90116-kube-api-access-qqlsr\") pod \"redhat-marketplace-cfdr8\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.437915 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:34 crc kubenswrapper[4907]: I1009 20:06:34.930260 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cfdr8"] Oct 09 20:06:35 crc kubenswrapper[4907]: I1009 20:06:35.810104 4907 generic.go:334] "Generic (PLEG): container finished" podID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerID="ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406" exitCode=0 Oct 09 20:06:35 crc kubenswrapper[4907]: I1009 20:06:35.810174 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cfdr8" event={"ID":"2f2a9664-e6e1-43ee-9b48-f9c220b90116","Type":"ContainerDied","Data":"ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406"} Oct 09 20:06:35 crc kubenswrapper[4907]: I1009 20:06:35.810559 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cfdr8" event={"ID":"2f2a9664-e6e1-43ee-9b48-f9c220b90116","Type":"ContainerStarted","Data":"9dbdef54caa6fa518268401bea6d7faaa7c28e111ffd7aea0b912381ab2aafa6"} Oct 09 20:06:36 crc kubenswrapper[4907]: I1009 20:06:36.152002 4907 scope.go:117] "RemoveContainer" 
containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:06:36 crc kubenswrapper[4907]: E1009 20:06:36.152744 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:06:36 crc kubenswrapper[4907]: I1009 20:06:36.824022 4907 generic.go:334] "Generic (PLEG): container finished" podID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerID="21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed" exitCode=0 Oct 09 20:06:36 crc kubenswrapper[4907]: I1009 20:06:36.824103 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cfdr8" event={"ID":"2f2a9664-e6e1-43ee-9b48-f9c220b90116","Type":"ContainerDied","Data":"21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed"} Oct 09 20:06:37 crc kubenswrapper[4907]: I1009 20:06:37.837218 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cfdr8" event={"ID":"2f2a9664-e6e1-43ee-9b48-f9c220b90116","Type":"ContainerStarted","Data":"776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc"} Oct 09 20:06:37 crc kubenswrapper[4907]: I1009 20:06:37.861214 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cfdr8" podStartSLOduration=2.343663872 podStartE2EDuration="3.861195715s" podCreationTimestamp="2025-10-09 20:06:34 +0000 UTC" firstStartedPulling="2025-10-09 20:06:35.813018914 +0000 UTC m=+2281.344986453" lastFinishedPulling="2025-10-09 20:06:37.330550797 +0000 UTC m=+2282.862518296" observedRunningTime="2025-10-09 20:06:37.854522548 +0000 
UTC m=+2283.386490057" watchObservedRunningTime="2025-10-09 20:06:37.861195715 +0000 UTC m=+2283.393163204" Oct 09 20:06:44 crc kubenswrapper[4907]: I1009 20:06:44.438265 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:44 crc kubenswrapper[4907]: I1009 20:06:44.438842 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:44 crc kubenswrapper[4907]: I1009 20:06:44.511528 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:44 crc kubenswrapper[4907]: E1009 20:06:44.552887 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2a9664_e6e1_43ee_9b48_f9c220b90116.slice/crio-conmon-ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406.scope\": RecentStats: unable to find data in memory cache]" Oct 09 20:06:45 crc kubenswrapper[4907]: I1009 20:06:45.005439 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:45 crc kubenswrapper[4907]: I1009 20:06:45.065807 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cfdr8"] Oct 09 20:06:46 crc kubenswrapper[4907]: I1009 20:06:46.964405 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cfdr8" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="registry-server" containerID="cri-o://776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc" gracePeriod=2 Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.152944 4907 scope.go:117] "RemoveContainer" 
containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:06:47 crc kubenswrapper[4907]: E1009 20:06:47.153490 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.416185 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.471413 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-catalog-content\") pod \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.471535 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqlsr\" (UniqueName: \"kubernetes.io/projected/2f2a9664-e6e1-43ee-9b48-f9c220b90116-kube-api-access-qqlsr\") pod \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.471622 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-utilities\") pod \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\" (UID: \"2f2a9664-e6e1-43ee-9b48-f9c220b90116\") " Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.473343 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-utilities" (OuterVolumeSpecName: "utilities") pod "2f2a9664-e6e1-43ee-9b48-f9c220b90116" (UID: "2f2a9664-e6e1-43ee-9b48-f9c220b90116"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.482210 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2a9664-e6e1-43ee-9b48-f9c220b90116-kube-api-access-qqlsr" (OuterVolumeSpecName: "kube-api-access-qqlsr") pod "2f2a9664-e6e1-43ee-9b48-f9c220b90116" (UID: "2f2a9664-e6e1-43ee-9b48-f9c220b90116"). InnerVolumeSpecName "kube-api-access-qqlsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.492012 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f2a9664-e6e1-43ee-9b48-f9c220b90116" (UID: "2f2a9664-e6e1-43ee-9b48-f9c220b90116"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.585941 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.586006 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqlsr\" (UniqueName: \"kubernetes.io/projected/2f2a9664-e6e1-43ee-9b48-f9c220b90116-kube-api-access-qqlsr\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.586038 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2a9664-e6e1-43ee-9b48-f9c220b90116-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.982648 4907 generic.go:334] "Generic (PLEG): container finished" podID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerID="776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc" exitCode=0 Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.982752 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cfdr8" Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.982790 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cfdr8" event={"ID":"2f2a9664-e6e1-43ee-9b48-f9c220b90116","Type":"ContainerDied","Data":"776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc"} Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.983303 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cfdr8" event={"ID":"2f2a9664-e6e1-43ee-9b48-f9c220b90116","Type":"ContainerDied","Data":"9dbdef54caa6fa518268401bea6d7faaa7c28e111ffd7aea0b912381ab2aafa6"} Oct 09 20:06:47 crc kubenswrapper[4907]: I1009 20:06:47.983352 4907 scope.go:117] "RemoveContainer" containerID="776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.040520 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cfdr8"] Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.042307 4907 scope.go:117] "RemoveContainer" containerID="21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.046144 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cfdr8"] Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.069825 4907 scope.go:117] "RemoveContainer" containerID="ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.108529 4907 scope.go:117] "RemoveContainer" containerID="776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc" Oct 09 20:06:48 crc kubenswrapper[4907]: E1009 20:06:48.109048 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc\": container with ID starting with 776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc not found: ID does not exist" containerID="776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.109082 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc"} err="failed to get container status \"776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc\": rpc error: code = NotFound desc = could not find container \"776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc\": container with ID starting with 776160afd212d1e4f1d7078cf3dfea2fb229713c13afe83ce23462799f2467cc not found: ID does not exist" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.109102 4907 scope.go:117] "RemoveContainer" containerID="21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed" Oct 09 20:06:48 crc kubenswrapper[4907]: E1009 20:06:48.109548 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed\": container with ID starting with 21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed not found: ID does not exist" containerID="21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.109576 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed"} err="failed to get container status \"21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed\": rpc error: code = NotFound desc = could not find container \"21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed\": container with ID 
starting with 21935eff39d792aba77cea8539f73c4a2ac98c27d83549289628b239a769c2ed not found: ID does not exist" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.109594 4907 scope.go:117] "RemoveContainer" containerID="ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406" Oct 09 20:06:48 crc kubenswrapper[4907]: E1009 20:06:48.109849 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406\": container with ID starting with ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406 not found: ID does not exist" containerID="ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406" Oct 09 20:06:48 crc kubenswrapper[4907]: I1009 20:06:48.109872 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406"} err="failed to get container status \"ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406\": rpc error: code = NotFound desc = could not find container \"ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406\": container with ID starting with ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406 not found: ID does not exist" Oct 09 20:06:49 crc kubenswrapper[4907]: I1009 20:06:49.171239 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" path="/var/lib/kubelet/pods/2f2a9664-e6e1-43ee-9b48-f9c220b90116/volumes" Oct 09 20:06:54 crc kubenswrapper[4907]: E1009 20:06:54.843636 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2a9664_e6e1_43ee_9b48_f9c220b90116.slice/crio-conmon-ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406.scope\": RecentStats: unable to find data 
in memory cache]" Oct 09 20:06:59 crc kubenswrapper[4907]: I1009 20:06:59.151433 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:06:59 crc kubenswrapper[4907]: E1009 20:06:59.152118 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:07:05 crc kubenswrapper[4907]: E1009 20:07:05.153137 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2a9664_e6e1_43ee_9b48_f9c220b90116.slice/crio-conmon-ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406.scope\": RecentStats: unable to find data in memory cache]" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.707078 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rxqv5"] Oct 09 20:07:07 crc kubenswrapper[4907]: E1009 20:07:07.707857 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="extract-content" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.707869 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="extract-content" Oct 09 20:07:07 crc kubenswrapper[4907]: E1009 20:07:07.707881 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="extract-utilities" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.707890 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="extract-utilities" Oct 09 20:07:07 crc kubenswrapper[4907]: E1009 20:07:07.707905 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="registry-server" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.707910 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="registry-server" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.708073 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2a9664-e6e1-43ee-9b48-f9c220b90116" containerName="registry-server" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.709397 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.722508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bwfv\" (UniqueName: \"kubernetes.io/projected/5afea26d-2592-4cc4-b659-4580dfeebbb5-kube-api-access-7bwfv\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.722759 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-catalog-content\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.722868 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-utilities\") pod 
\"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.727916 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxqv5"] Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.824313 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bwfv\" (UniqueName: \"kubernetes.io/projected/5afea26d-2592-4cc4-b659-4580dfeebbb5-kube-api-access-7bwfv\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.824376 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-catalog-content\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.824398 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-utilities\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.824927 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-catalog-content\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.824977 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-utilities\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:07 crc kubenswrapper[4907]: I1009 20:07:07.856137 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bwfv\" (UniqueName: \"kubernetes.io/projected/5afea26d-2592-4cc4-b659-4580dfeebbb5-kube-api-access-7bwfv\") pod \"community-operators-rxqv5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:08 crc kubenswrapper[4907]: I1009 20:07:08.032981 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:08 crc kubenswrapper[4907]: I1009 20:07:08.543391 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxqv5"] Oct 09 20:07:09 crc kubenswrapper[4907]: I1009 20:07:09.227662 4907 generic.go:334] "Generic (PLEG): container finished" podID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerID="4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a" exitCode=0 Oct 09 20:07:09 crc kubenswrapper[4907]: I1009 20:07:09.227746 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxqv5" event={"ID":"5afea26d-2592-4cc4-b659-4580dfeebbb5","Type":"ContainerDied","Data":"4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a"} Oct 09 20:07:09 crc kubenswrapper[4907]: I1009 20:07:09.228122 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxqv5" event={"ID":"5afea26d-2592-4cc4-b659-4580dfeebbb5","Type":"ContainerStarted","Data":"bbb028afb7afecd5c838159ff0ecec1781b4f45460857113711c8e455e66c08a"} Oct 09 20:07:10 crc kubenswrapper[4907]: I1009 
20:07:10.246808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxqv5" event={"ID":"5afea26d-2592-4cc4-b659-4580dfeebbb5","Type":"ContainerStarted","Data":"68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56"} Oct 09 20:07:11 crc kubenswrapper[4907]: I1009 20:07:11.259758 4907 generic.go:334] "Generic (PLEG): container finished" podID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerID="68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56" exitCode=0 Oct 09 20:07:11 crc kubenswrapper[4907]: I1009 20:07:11.260038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxqv5" event={"ID":"5afea26d-2592-4cc4-b659-4580dfeebbb5","Type":"ContainerDied","Data":"68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56"} Oct 09 20:07:12 crc kubenswrapper[4907]: I1009 20:07:12.274638 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxqv5" event={"ID":"5afea26d-2592-4cc4-b659-4580dfeebbb5","Type":"ContainerStarted","Data":"32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427"} Oct 09 20:07:12 crc kubenswrapper[4907]: I1009 20:07:12.299758 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rxqv5" podStartSLOduration=2.824294576 podStartE2EDuration="5.29973763s" podCreationTimestamp="2025-10-09 20:07:07 +0000 UTC" firstStartedPulling="2025-10-09 20:07:09.236361245 +0000 UTC m=+2314.768328774" lastFinishedPulling="2025-10-09 20:07:11.711804329 +0000 UTC m=+2317.243771828" observedRunningTime="2025-10-09 20:07:12.292247941 +0000 UTC m=+2317.824215460" watchObservedRunningTime="2025-10-09 20:07:12.29973763 +0000 UTC m=+2317.831705139" Oct 09 20:07:13 crc kubenswrapper[4907]: I1009 20:07:13.152341 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 
20:07:13 crc kubenswrapper[4907]: E1009 20:07:13.152653 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:07:15 crc kubenswrapper[4907]: E1009 20:07:15.399605 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2a9664_e6e1_43ee_9b48_f9c220b90116.slice/crio-conmon-ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406.scope\": RecentStats: unable to find data in memory cache]" Oct 09 20:07:18 crc kubenswrapper[4907]: I1009 20:07:18.033363 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:18 crc kubenswrapper[4907]: I1009 20:07:18.033926 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:18 crc kubenswrapper[4907]: I1009 20:07:18.087167 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:18 crc kubenswrapper[4907]: I1009 20:07:18.374537 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:18 crc kubenswrapper[4907]: I1009 20:07:18.421846 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxqv5"] Oct 09 20:07:20 crc kubenswrapper[4907]: I1009 20:07:20.354487 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-rxqv5" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="registry-server" containerID="cri-o://32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427" gracePeriod=2 Oct 09 20:07:20 crc kubenswrapper[4907]: I1009 20:07:20.814171 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:20 crc kubenswrapper[4907]: I1009 20:07:20.979619 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bwfv\" (UniqueName: \"kubernetes.io/projected/5afea26d-2592-4cc4-b659-4580dfeebbb5-kube-api-access-7bwfv\") pod \"5afea26d-2592-4cc4-b659-4580dfeebbb5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " Oct 09 20:07:20 crc kubenswrapper[4907]: I1009 20:07:20.979969 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-catalog-content\") pod \"5afea26d-2592-4cc4-b659-4580dfeebbb5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " Oct 09 20:07:20 crc kubenswrapper[4907]: I1009 20:07:20.980312 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-utilities\") pod \"5afea26d-2592-4cc4-b659-4580dfeebbb5\" (UID: \"5afea26d-2592-4cc4-b659-4580dfeebbb5\") " Oct 09 20:07:20 crc kubenswrapper[4907]: I1009 20:07:20.981195 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-utilities" (OuterVolumeSpecName: "utilities") pod "5afea26d-2592-4cc4-b659-4580dfeebbb5" (UID: "5afea26d-2592-4cc4-b659-4580dfeebbb5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:07:20 crc kubenswrapper[4907]: I1009 20:07:20.985995 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5afea26d-2592-4cc4-b659-4580dfeebbb5-kube-api-access-7bwfv" (OuterVolumeSpecName: "kube-api-access-7bwfv") pod "5afea26d-2592-4cc4-b659-4580dfeebbb5" (UID: "5afea26d-2592-4cc4-b659-4580dfeebbb5"). InnerVolumeSpecName "kube-api-access-7bwfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.083418 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.083507 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bwfv\" (UniqueName: \"kubernetes.io/projected/5afea26d-2592-4cc4-b659-4580dfeebbb5-kube-api-access-7bwfv\") on node \"crc\" DevicePath \"\"" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.159974 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5afea26d-2592-4cc4-b659-4580dfeebbb5" (UID: "5afea26d-2592-4cc4-b659-4580dfeebbb5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.186153 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5afea26d-2592-4cc4-b659-4580dfeebbb5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.372182 4907 generic.go:334] "Generic (PLEG): container finished" podID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerID="32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427" exitCode=0 Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.372232 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxqv5" event={"ID":"5afea26d-2592-4cc4-b659-4580dfeebbb5","Type":"ContainerDied","Data":"32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427"} Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.372265 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxqv5" event={"ID":"5afea26d-2592-4cc4-b659-4580dfeebbb5","Type":"ContainerDied","Data":"bbb028afb7afecd5c838159ff0ecec1781b4f45460857113711c8e455e66c08a"} Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.372286 4907 scope.go:117] "RemoveContainer" containerID="32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.372454 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxqv5" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.398633 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxqv5"] Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.407035 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rxqv5"] Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.410970 4907 scope.go:117] "RemoveContainer" containerID="68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.458267 4907 scope.go:117] "RemoveContainer" containerID="4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.528353 4907 scope.go:117] "RemoveContainer" containerID="32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427" Oct 09 20:07:21 crc kubenswrapper[4907]: E1009 20:07:21.528840 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427\": container with ID starting with 32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427 not found: ID does not exist" containerID="32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.528882 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427"} err="failed to get container status \"32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427\": rpc error: code = NotFound desc = could not find container \"32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427\": container with ID starting with 32b2a7ed728e1703e1a94f6bce0a260734c314caf82866c8b68d103d10e00427 not 
found: ID does not exist" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.528907 4907 scope.go:117] "RemoveContainer" containerID="68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56" Oct 09 20:07:21 crc kubenswrapper[4907]: E1009 20:07:21.529404 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56\": container with ID starting with 68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56 not found: ID does not exist" containerID="68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.529506 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56"} err="failed to get container status \"68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56\": rpc error: code = NotFound desc = could not find container \"68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56\": container with ID starting with 68c87676284b04fbd8237c5b7adfd8de862c6c2be0fadfea0dc36a132b4efe56 not found: ID does not exist" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.529553 4907 scope.go:117] "RemoveContainer" containerID="4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a" Oct 09 20:07:21 crc kubenswrapper[4907]: E1009 20:07:21.529924 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a\": container with ID starting with 4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a not found: ID does not exist" containerID="4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a" Oct 09 20:07:21 crc kubenswrapper[4907]: I1009 20:07:21.529959 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a"} err="failed to get container status \"4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a\": rpc error: code = NotFound desc = could not find container \"4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a\": container with ID starting with 4f2395788af980ef94649b5cad83a4d5c8e2e2c0c7d62e5f4bcaf57fff19287a not found: ID does not exist" Oct 09 20:07:23 crc kubenswrapper[4907]: I1009 20:07:23.163048 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" path="/var/lib/kubelet/pods/5afea26d-2592-4cc4-b659-4580dfeebbb5/volumes" Oct 09 20:07:25 crc kubenswrapper[4907]: E1009 20:07:25.643329 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2a9664_e6e1_43ee_9b48_f9c220b90116.slice/crio-conmon-ec8a5c378f17ec130b378ba7298d35f5141d1d626da3e8c8e41a02fb1979b406.scope\": RecentStats: unable to find data in memory cache]" Oct 09 20:07:28 crc kubenswrapper[4907]: I1009 20:07:28.152809 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:07:28 crc kubenswrapper[4907]: E1009 20:07:28.154056 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:07:41 crc kubenswrapper[4907]: I1009 20:07:41.152929 4907 scope.go:117] "RemoveContainer" 
containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:07:41 crc kubenswrapper[4907]: E1009 20:07:41.154140 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:07:54 crc kubenswrapper[4907]: I1009 20:07:54.698183 4907 generic.go:334] "Generic (PLEG): container finished" podID="313461ee-e16e-42e8-97ef-5e2d16f23cb5" containerID="a53d0533fc7889ce06a23cfb44eb04f20fbfed9704fb67f02308f43bd123bcf8" exitCode=0 Oct 09 20:07:54 crc kubenswrapper[4907]: I1009 20:07:54.698627 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" event={"ID":"313461ee-e16e-42e8-97ef-5e2d16f23cb5","Type":"ContainerDied","Data":"a53d0533fc7889ce06a23cfb44eb04f20fbfed9704fb67f02308f43bd123bcf8"} Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.125024 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.151060 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:07:56 crc kubenswrapper[4907]: E1009 20:07:56.151530 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.301847 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2nj4\" (UniqueName: \"kubernetes.io/projected/313461ee-e16e-42e8-97ef-5e2d16f23cb5-kube-api-access-q2nj4\") pod \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.301989 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-ssh-key\") pod \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.302031 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-secret-0\") pod \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.302055 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-combined-ca-bundle\") pod \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.302186 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-inventory\") pod \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\" (UID: \"313461ee-e16e-42e8-97ef-5e2d16f23cb5\") " Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.307276 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313461ee-e16e-42e8-97ef-5e2d16f23cb5-kube-api-access-q2nj4" (OuterVolumeSpecName: "kube-api-access-q2nj4") pod "313461ee-e16e-42e8-97ef-5e2d16f23cb5" (UID: "313461ee-e16e-42e8-97ef-5e2d16f23cb5"). InnerVolumeSpecName "kube-api-access-q2nj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.317676 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "313461ee-e16e-42e8-97ef-5e2d16f23cb5" (UID: "313461ee-e16e-42e8-97ef-5e2d16f23cb5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.343836 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "313461ee-e16e-42e8-97ef-5e2d16f23cb5" (UID: "313461ee-e16e-42e8-97ef-5e2d16f23cb5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.356183 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "313461ee-e16e-42e8-97ef-5e2d16f23cb5" (UID: "313461ee-e16e-42e8-97ef-5e2d16f23cb5"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.364343 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-inventory" (OuterVolumeSpecName: "inventory") pod "313461ee-e16e-42e8-97ef-5e2d16f23cb5" (UID: "313461ee-e16e-42e8-97ef-5e2d16f23cb5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.404715 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.404907 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2nj4\" (UniqueName: \"kubernetes.io/projected/313461ee-e16e-42e8-97ef-5e2d16f23cb5-kube-api-access-q2nj4\") on node \"crc\" DevicePath \"\"" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.405002 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.405069 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:07:56 crc 
kubenswrapper[4907]: I1009 20:07:56.405128 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313461ee-e16e-42e8-97ef-5e2d16f23cb5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.724729 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6" event={"ID":"313461ee-e16e-42e8-97ef-5e2d16f23cb5","Type":"ContainerDied","Data":"45c6313c74d59859db3a71d1660a8327c0a3d7b831d85f3b2cf57b8aa17ce7b9"}
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.724806 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45c6313c74d59859db3a71d1660a8327c0a3d7b831d85f3b2cf57b8aa17ce7b9"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.724858 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.837543 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"]
Oct 09 20:07:56 crc kubenswrapper[4907]: E1009 20:07:56.838062 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="extract-utilities"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.838082 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="extract-utilities"
Oct 09 20:07:56 crc kubenswrapper[4907]: E1009 20:07:56.838102 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="extract-content"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.838112 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="extract-content"
Oct 09 20:07:56 crc kubenswrapper[4907]: E1009 20:07:56.838127 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313461ee-e16e-42e8-97ef-5e2d16f23cb5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.838137 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="313461ee-e16e-42e8-97ef-5e2d16f23cb5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:07:56 crc kubenswrapper[4907]: E1009 20:07:56.838187 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="registry-server"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.838197 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="registry-server"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.838420 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="313461ee-e16e-42e8-97ef-5e2d16f23cb5" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.838455 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5afea26d-2592-4cc4-b659-4580dfeebbb5" containerName="registry-server"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.839285 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.842419 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.843246 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.843310 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.843770 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.844147 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.844169 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.844220 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.852672 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"]
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.914975 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.915041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5fgg\" (UniqueName: \"kubernetes.io/projected/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-kube-api-access-q5fgg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.915084 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.915104 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.915346 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.915432 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.915511 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.915661 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:56 crc kubenswrapper[4907]: I1009 20:07:56.916002 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.017733 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.017790 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5fgg\" (UniqueName: \"kubernetes.io/projected/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-kube-api-access-q5fgg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.017837 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.017860 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.017942 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.017966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.017992 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.018024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.018089 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.019510 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.023375 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.024087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.024520 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.025523 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.026585 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.027398 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.028814 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.047160 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5fgg\" (UniqueName: \"kubernetes.io/projected/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-kube-api-access-q5fgg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj6zm\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.167716 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:07:57 crc kubenswrapper[4907]: I1009 20:07:57.784981 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"]
Oct 09 20:07:58 crc kubenswrapper[4907]: I1009 20:07:58.746131 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm" event={"ID":"cf1c00bc-7815-4bf7-8c42-d85c38936b4b","Type":"ContainerStarted","Data":"3f715e91f943f06000270340cd2bc51448ef78850782f8dbc88a38abb8c432e9"}
Oct 09 20:07:58 crc kubenswrapper[4907]: I1009 20:07:58.746563 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm" event={"ID":"cf1c00bc-7815-4bf7-8c42-d85c38936b4b","Type":"ContainerStarted","Data":"a9c032dd59fde8c0ae7382cd96cfc54c8e7d4580f35da6a7407b8169543c46c5"}
Oct 09 20:07:58 crc kubenswrapper[4907]: I1009 20:07:58.772250 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm" podStartSLOduration=2.326386982 podStartE2EDuration="2.772232668s" podCreationTimestamp="2025-10-09 20:07:56 +0000 UTC" firstStartedPulling="2025-10-09 20:07:57.791521016 +0000 UTC m=+2363.323488515" lastFinishedPulling="2025-10-09 20:07:58.237366712 +0000 UTC m=+2363.769334201" observedRunningTime="2025-10-09 20:07:58.764931324 +0000 UTC m=+2364.296898813" watchObservedRunningTime="2025-10-09 20:07:58.772232668 +0000 UTC m=+2364.304200157"
Oct 09 20:08:09 crc kubenswrapper[4907]: I1009 20:08:09.152505 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:08:09 crc kubenswrapper[4907]: E1009 20:08:09.153635 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:08:20 crc kubenswrapper[4907]: I1009 20:08:20.152180 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:08:20 crc kubenswrapper[4907]: E1009 20:08:20.153592 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:08:31 crc kubenswrapper[4907]: I1009 20:08:31.151431 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:08:31 crc kubenswrapper[4907]: E1009 20:08:31.152854 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:08:44 crc kubenswrapper[4907]: I1009 20:08:44.152128 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:08:44 crc kubenswrapper[4907]: E1009 20:08:44.153026 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:08:59 crc kubenswrapper[4907]: I1009 20:08:59.152050 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:08:59 crc kubenswrapper[4907]: E1009 20:08:59.152803 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:09:10 crc kubenswrapper[4907]: I1009 20:09:10.152511 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:09:10 crc kubenswrapper[4907]: E1009 20:09:10.153535 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:09:23 crc kubenswrapper[4907]: I1009 20:09:23.152265 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:09:23 crc kubenswrapper[4907]: E1009 20:09:23.153617 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:09:36 crc kubenswrapper[4907]: I1009 20:09:36.152391 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:09:36 crc kubenswrapper[4907]: E1009 20:09:36.153878 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:09:50 crc kubenswrapper[4907]: I1009 20:09:50.151529 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:09:50 crc kubenswrapper[4907]: E1009 20:09:50.153612 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:10:01 crc kubenswrapper[4907]: I1009 20:10:01.152109 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:10:01 crc kubenswrapper[4907]: E1009 20:10:01.152952 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:10:16 crc kubenswrapper[4907]: I1009 20:10:16.152029 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:10:16 crc kubenswrapper[4907]: E1009 20:10:16.153400 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:10:31 crc kubenswrapper[4907]: I1009 20:10:31.152543 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:10:31 crc kubenswrapper[4907]: E1009 20:10:31.153919 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:10:45 crc kubenswrapper[4907]: I1009 20:10:45.174007 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7"
Oct 09 20:10:45 crc kubenswrapper[4907]: I1009 20:10:45.526773 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"557db504600f3c41488c9227100b74382f080751a479b2a9db5ed710b0f070c4"}
Oct 09 20:11:20 crc kubenswrapper[4907]: I1009 20:11:20.915235 4907 generic.go:334] "Generic (PLEG): container finished" podID="cf1c00bc-7815-4bf7-8c42-d85c38936b4b" containerID="3f715e91f943f06000270340cd2bc51448ef78850782f8dbc88a38abb8c432e9" exitCode=0
Oct 09 20:11:20 crc kubenswrapper[4907]: I1009 20:11:20.915343 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm" event={"ID":"cf1c00bc-7815-4bf7-8c42-d85c38936b4b","Type":"ContainerDied","Data":"3f715e91f943f06000270340cd2bc51448ef78850782f8dbc88a38abb8c432e9"}
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.396162 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479424 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5fgg\" (UniqueName: \"kubernetes.io/projected/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-kube-api-access-q5fgg\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-combined-ca-bundle\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479550 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-1\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479578 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-0\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479616 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-1\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479637 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-ssh-key\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479665 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-extra-config-0\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479762 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-inventory\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.479826 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-0\") pod \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\" (UID: \"cf1c00bc-7815-4bf7-8c42-d85c38936b4b\") "
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.494875 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-kube-api-access-q5fgg" (OuterVolumeSpecName: "kube-api-access-q5fgg") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "kube-api-access-q5fgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.497506 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.512453 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-inventory" (OuterVolumeSpecName: "inventory") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.514699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.514714 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.516472 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.516540 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.517544 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.520327 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "cf1c00bc-7815-4bf7-8c42-d85c38936b4b" (UID: "cf1c00bc-7815-4bf7-8c42-d85c38936b4b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581297 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5fgg\" (UniqueName: \"kubernetes.io/projected/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-kube-api-access-q5fgg\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581337 4907 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581350 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581362 4907 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581373 4907 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581386 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581397 4907 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581409 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-inventory\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.581420 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/cf1c00bc-7815-4bf7-8c42-d85c38936b4b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.939088 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm" event={"ID":"cf1c00bc-7815-4bf7-8c42-d85c38936b4b","Type":"ContainerDied","Data":"a9c032dd59fde8c0ae7382cd96cfc54c8e7d4580f35da6a7407b8169543c46c5"}
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.939136 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9c032dd59fde8c0ae7382cd96cfc54c8e7d4580f35da6a7407b8169543c46c5"
Oct 09 20:11:22 crc kubenswrapper[4907]: I1009 20:11:22.939178 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj6zm"
Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.058874 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx"]
Oct 09 20:11:23 crc kubenswrapper[4907]: E1009 20:11:23.059389 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1c00bc-7815-4bf7-8c42-d85c38936b4b" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.059409 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1c00bc-7815-4bf7-8c42-d85c38936b4b" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.059673 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1c00bc-7815-4bf7-8c42-d85c38936b4b" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.060537 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.063071 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.063743 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.064375 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.064909 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jdgvx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.065685 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.080907 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx"] Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.089660 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.089779 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: 
\"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.089889 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.089922 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56k7g\" (UniqueName: \"kubernetes.io/projected/40270666-7351-4172-b5d9-c523b405ae52-kube-api-access-56k7g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.090014 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.090072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 
20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.090102 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.191500 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.191633 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.191692 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.191831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.191877 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.192034 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.193210 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56k7g\" (UniqueName: \"kubernetes.io/projected/40270666-7351-4172-b5d9-c523b405ae52-kube-api-access-56k7g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.197672 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: 
\"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.197867 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.198016 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.198541 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.202931 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.212744 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.222131 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56k7g\" (UniqueName: \"kubernetes.io/projected/40270666-7351-4172-b5d9-c523b405ae52-kube-api-access-56k7g\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.392287 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.922623 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx"] Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.930335 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 20:11:23 crc kubenswrapper[4907]: I1009 20:11:23.951991 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" event={"ID":"40270666-7351-4172-b5d9-c523b405ae52","Type":"ContainerStarted","Data":"05e290c598b500989c2a7262108a7a4b85150235c701a995bb6bdd6eebd5ae6e"} Oct 09 20:11:24 crc kubenswrapper[4907]: I1009 20:11:24.966528 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" event={"ID":"40270666-7351-4172-b5d9-c523b405ae52","Type":"ContainerStarted","Data":"d2381e8c5257824442d1a3f9e074023981cab368248268ca38c79419688e5171"} Oct 09 
20:11:24 crc kubenswrapper[4907]: I1009 20:11:24.996272 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" podStartSLOduration=1.55763896 podStartE2EDuration="1.996250466s" podCreationTimestamp="2025-10-09 20:11:23 +0000 UTC" firstStartedPulling="2025-10-09 20:11:23.929937896 +0000 UTC m=+2569.461905405" lastFinishedPulling="2025-10-09 20:11:24.368549392 +0000 UTC m=+2569.900516911" observedRunningTime="2025-10-09 20:11:24.991350817 +0000 UTC m=+2570.523318336" watchObservedRunningTime="2025-10-09 20:11:24.996250466 +0000 UTC m=+2570.528217965" Oct 09 20:13:06 crc kubenswrapper[4907]: I1009 20:13:06.298910 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:13:06 crc kubenswrapper[4907]: I1009 20:13:06.299356 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:13:36 crc kubenswrapper[4907]: I1009 20:13:36.300065 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:13:36 crc kubenswrapper[4907]: I1009 20:13:36.300898 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:13:51 crc kubenswrapper[4907]: I1009 20:13:51.518328 4907 generic.go:334] "Generic (PLEG): container finished" podID="40270666-7351-4172-b5d9-c523b405ae52" containerID="d2381e8c5257824442d1a3f9e074023981cab368248268ca38c79419688e5171" exitCode=0 Oct 09 20:13:51 crc kubenswrapper[4907]: I1009 20:13:51.518393 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" event={"ID":"40270666-7351-4172-b5d9-c523b405ae52","Type":"ContainerDied","Data":"d2381e8c5257824442d1a3f9e074023981cab368248268ca38c79419688e5171"} Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.019845 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.182547 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-inventory\") pod \"40270666-7351-4172-b5d9-c523b405ae52\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.182588 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-2\") pod \"40270666-7351-4172-b5d9-c523b405ae52\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.182689 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-1\") pod 
\"40270666-7351-4172-b5d9-c523b405ae52\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.182944 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-telemetry-combined-ca-bundle\") pod \"40270666-7351-4172-b5d9-c523b405ae52\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.183021 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56k7g\" (UniqueName: \"kubernetes.io/projected/40270666-7351-4172-b5d9-c523b405ae52-kube-api-access-56k7g\") pod \"40270666-7351-4172-b5d9-c523b405ae52\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.183201 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-0\") pod \"40270666-7351-4172-b5d9-c523b405ae52\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.183249 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ssh-key\") pod \"40270666-7351-4172-b5d9-c523b405ae52\" (UID: \"40270666-7351-4172-b5d9-c523b405ae52\") " Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.190149 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "40270666-7351-4172-b5d9-c523b405ae52" (UID: "40270666-7351-4172-b5d9-c523b405ae52"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.190206 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40270666-7351-4172-b5d9-c523b405ae52-kube-api-access-56k7g" (OuterVolumeSpecName: "kube-api-access-56k7g") pod "40270666-7351-4172-b5d9-c523b405ae52" (UID: "40270666-7351-4172-b5d9-c523b405ae52"). InnerVolumeSpecName "kube-api-access-56k7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.214775 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40270666-7351-4172-b5d9-c523b405ae52" (UID: "40270666-7351-4172-b5d9-c523b405ae52"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.215169 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-inventory" (OuterVolumeSpecName: "inventory") pod "40270666-7351-4172-b5d9-c523b405ae52" (UID: "40270666-7351-4172-b5d9-c523b405ae52"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.222588 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "40270666-7351-4172-b5d9-c523b405ae52" (UID: "40270666-7351-4172-b5d9-c523b405ae52"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.223639 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "40270666-7351-4172-b5d9-c523b405ae52" (UID: "40270666-7351-4172-b5d9-c523b405ae52"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.228544 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "40270666-7351-4172-b5d9-c523b405ae52" (UID: "40270666-7351-4172-b5d9-c523b405ae52"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.286109 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56k7g\" (UniqueName: \"kubernetes.io/projected/40270666-7351-4172-b5d9-c523b405ae52-kube-api-access-56k7g\") on node \"crc\" DevicePath \"\"" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.286161 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.286183 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.286201 4907 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.286219 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.286238 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.286256 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40270666-7351-4172-b5d9-c523b405ae52-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.539910 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" event={"ID":"40270666-7351-4172-b5d9-c523b405ae52","Type":"ContainerDied","Data":"05e290c598b500989c2a7262108a7a4b85150235c701a995bb6bdd6eebd5ae6e"} Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.540020 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05e290c598b500989c2a7262108a7a4b85150235c701a995bb6bdd6eebd5ae6e" Oct 09 20:13:53 crc kubenswrapper[4907]: I1009 20:13:53.539960 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx" Oct 09 20:14:00 crc kubenswrapper[4907]: I1009 20:14:00.931662 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 20:14:00 crc kubenswrapper[4907]: I1009 20:14:00.932452 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="7338c5b1-5214-40d6-a82c-f1f53697a06a" containerName="kube-state-metrics" containerID="cri-o://dee68b40af7e691a836abbf7a15b3773a698f5c1d80eda642569a7ba1377ce7b" gracePeriod=30 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.004404 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.004703 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-notification-agent" containerID="cri-o://79761f4f7288208cdb015650c604aff38ff244c4833d0aa6aaef8971055ac638" gracePeriod=30 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.004784 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="proxy-httpd" containerID="cri-o://b52a33ba048abbb2ee4170dc75c120902fca79f2296ad600b7dbc6b25f6bf192" gracePeriod=30 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.004858 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="sg-core" containerID="cri-o://b59fef055a607abe26d6dd7b4e6c0f42fb27cc1dba53968467e95667208afcbb" gracePeriod=30 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.004921 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-central-agent" containerID="cri-o://2b92bb2c6b19233b35f0cd7a26509483b6c9f9115bb5d2c4b567a12151e04c1e" gracePeriod=30 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.211916 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc"] Oct 09 20:14:01 crc kubenswrapper[4907]: E1009 20:14:01.212738 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40270666-7351-4172-b5d9-c523b405ae52" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.212766 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="40270666-7351-4172-b5d9-c523b405ae52" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.213027 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="40270666-7351-4172-b5d9-c523b405ae52" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.228686 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc"] Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.228807 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.236307 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.343995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxfg\" (UniqueName: \"kubernetes.io/projected/20e43860-0c38-4f47-83e6-147765347183-kube-api-access-wrxfg\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.344202 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.344329 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.446119 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-bundle\") pod 
\"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.446222 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.446349 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxfg\" (UniqueName: \"kubernetes.io/projected/20e43860-0c38-4f47-83e6-147765347183-kube-api-access-wrxfg\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.446719 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-bundle\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.446886 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-util\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " 
pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.474171 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxfg\" (UniqueName: \"kubernetes.io/projected/20e43860-0c38-4f47-83e6-147765347183-kube-api-access-wrxfg\") pod \"142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.556534 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.658273 4907 generic.go:334] "Generic (PLEG): container finished" podID="7338c5b1-5214-40d6-a82c-f1f53697a06a" containerID="dee68b40af7e691a836abbf7a15b3773a698f5c1d80eda642569a7ba1377ce7b" exitCode=2 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.658678 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7338c5b1-5214-40d6-a82c-f1f53697a06a","Type":"ContainerDied","Data":"dee68b40af7e691a836abbf7a15b3773a698f5c1d80eda642569a7ba1377ce7b"} Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.674926 4907 generic.go:334] "Generic (PLEG): container finished" podID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerID="2b92bb2c6b19233b35f0cd7a26509483b6c9f9115bb5d2c4b567a12151e04c1e" exitCode=0 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.674969 4907 generic.go:334] "Generic (PLEG): container finished" podID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerID="b52a33ba048abbb2ee4170dc75c120902fca79f2296ad600b7dbc6b25f6bf192" exitCode=0 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.674982 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerID="b59fef055a607abe26d6dd7b4e6c0f42fb27cc1dba53968467e95667208afcbb" exitCode=2 Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.675004 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerDied","Data":"2b92bb2c6b19233b35f0cd7a26509483b6c9f9115bb5d2c4b567a12151e04c1e"} Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.675035 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerDied","Data":"b52a33ba048abbb2ee4170dc75c120902fca79f2296ad600b7dbc6b25f6bf192"} Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.675051 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerDied","Data":"b59fef055a607abe26d6dd7b4e6c0f42fb27cc1dba53968467e95667208afcbb"} Oct 09 20:14:01 crc kubenswrapper[4907]: I1009 20:14:01.929338 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.057111 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-certs\") pod \"7338c5b1-5214-40d6-a82c-f1f53697a06a\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.057176 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-combined-ca-bundle\") pod \"7338c5b1-5214-40d6-a82c-f1f53697a06a\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.057399 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmftr\" (UniqueName: \"kubernetes.io/projected/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-api-access-pmftr\") pod \"7338c5b1-5214-40d6-a82c-f1f53697a06a\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.057456 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-config\") pod \"7338c5b1-5214-40d6-a82c-f1f53697a06a\" (UID: \"7338c5b1-5214-40d6-a82c-f1f53697a06a\") " Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.072736 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-api-access-pmftr" (OuterVolumeSpecName: "kube-api-access-pmftr") pod "7338c5b1-5214-40d6-a82c-f1f53697a06a" (UID: "7338c5b1-5214-40d6-a82c-f1f53697a06a"). InnerVolumeSpecName "kube-api-access-pmftr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.092378 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "7338c5b1-5214-40d6-a82c-f1f53697a06a" (UID: "7338c5b1-5214-40d6-a82c-f1f53697a06a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.098027 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7338c5b1-5214-40d6-a82c-f1f53697a06a" (UID: "7338c5b1-5214-40d6-a82c-f1f53697a06a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.115050 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc"] Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.135679 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "7338c5b1-5214-40d6-a82c-f1f53697a06a" (UID: "7338c5b1-5214-40d6-a82c-f1f53697a06a"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.161640 4907 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.161678 4907 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.161693 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338c5b1-5214-40d6-a82c-f1f53697a06a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.161706 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmftr\" (UniqueName: \"kubernetes.io/projected/7338c5b1-5214-40d6-a82c-f1f53697a06a-kube-api-access-pmftr\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.687584 4907 generic.go:334] "Generic (PLEG): container finished" podID="20e43860-0c38-4f47-83e6-147765347183" containerID="70beed167bb1cfa6b37994e1c8844f770046125866d243e3a97e6d7fe3eaac4f" exitCode=0 Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.687651 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" event={"ID":"20e43860-0c38-4f47-83e6-147765347183","Type":"ContainerDied","Data":"70beed167bb1cfa6b37994e1c8844f770046125866d243e3a97e6d7fe3eaac4f"} Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.687970 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" event={"ID":"20e43860-0c38-4f47-83e6-147765347183","Type":"ContainerStarted","Data":"79727ae7ad74d2d509400f0ab887ef0d6a7ef32c33e5af2437866f0dd3766585"} Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.691859 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7338c5b1-5214-40d6-a82c-f1f53697a06a","Type":"ContainerDied","Data":"e8446bd6d5315098ed32e6e0dbe2ce2d9374ee4278b03558f46f39bfcdaf3e6a"} Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.691902 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.691924 4907 scope.go:117] "RemoveContainer" containerID="dee68b40af7e691a836abbf7a15b3773a698f5c1d80eda642569a7ba1377ce7b" Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.739935 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 20:14:02 crc kubenswrapper[4907]: I1009 20:14:02.748484 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 20:14:03 crc kubenswrapper[4907]: I1009 20:14:03.203915 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7338c5b1-5214-40d6-a82c-f1f53697a06a" path="/var/lib/kubelet/pods/7338c5b1-5214-40d6-a82c-f1f53697a06a/volumes" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.555113 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Oct 09 20:14:04 crc kubenswrapper[4907]: E1009 20:14:04.555808 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7338c5b1-5214-40d6-a82c-f1f53697a06a" containerName="kube-state-metrics" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.555825 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7338c5b1-5214-40d6-a82c-f1f53697a06a" 
containerName="kube-state-metrics" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.556160 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7338c5b1-5214-40d6-a82c-f1f53697a06a" containerName="kube-state-metrics" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.556974 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.559621 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.560863 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.568853 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.712331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7985b40-9182-4fee-9273-421e43faca7a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7985b40-9182-4fee-9273-421e43faca7a\") pod \"minio\" (UID: \"75ec575f-175f-4d3f-a66c-82fbd1ebc822\") " pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.712439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kq6b\" (UniqueName: \"kubernetes.io/projected/75ec575f-175f-4d3f-a66c-82fbd1ebc822-kube-api-access-7kq6b\") pod \"minio\" (UID: \"75ec575f-175f-4d3f-a66c-82fbd1ebc822\") " pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.714796 4907 generic.go:334] "Generic (PLEG): container finished" podID="20e43860-0c38-4f47-83e6-147765347183" containerID="50cc38ca41cbd2dc830663751d255038d068b3500098a07d27258da79b79fe7d" exitCode=0 Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.714837 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" event={"ID":"20e43860-0c38-4f47-83e6-147765347183","Type":"ContainerDied","Data":"50cc38ca41cbd2dc830663751d255038d068b3500098a07d27258da79b79fe7d"} Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.814549 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kq6b\" (UniqueName: \"kubernetes.io/projected/75ec575f-175f-4d3f-a66c-82fbd1ebc822-kube-api-access-7kq6b\") pod \"minio\" (UID: \"75ec575f-175f-4d3f-a66c-82fbd1ebc822\") " pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.814687 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7985b40-9182-4fee-9273-421e43faca7a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7985b40-9182-4fee-9273-421e43faca7a\") pod \"minio\" (UID: \"75ec575f-175f-4d3f-a66c-82fbd1ebc822\") " pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.817308 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.817346 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7985b40-9182-4fee-9273-421e43faca7a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7985b40-9182-4fee-9273-421e43faca7a\") pod \"minio\" (UID: \"75ec575f-175f-4d3f-a66c-82fbd1ebc822\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b6ceee8ce143ffaf424e79eec619bbc0500561bef455bbd17d5981ba8a4f5d79/globalmount\"" pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.835926 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kq6b\" (UniqueName: \"kubernetes.io/projected/75ec575f-175f-4d3f-a66c-82fbd1ebc822-kube-api-access-7kq6b\") pod \"minio\" (UID: \"75ec575f-175f-4d3f-a66c-82fbd1ebc822\") " pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.853264 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7985b40-9182-4fee-9273-421e43faca7a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7985b40-9182-4fee-9273-421e43faca7a\") pod \"minio\" (UID: \"75ec575f-175f-4d3f-a66c-82fbd1ebc822\") " pod="minio-dev/minio" Oct 09 20:14:04 crc kubenswrapper[4907]: I1009 20:14:04.876233 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Oct 09 20:14:05 crc kubenswrapper[4907]: I1009 20:14:05.312333 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Oct 09 20:14:05 crc kubenswrapper[4907]: I1009 20:14:05.725879 4907 generic.go:334] "Generic (PLEG): container finished" podID="20e43860-0c38-4f47-83e6-147765347183" containerID="31924b42caa40ae4e51775a21651da3a89b4a83cb05efb603714a2b110c81545" exitCode=0 Oct 09 20:14:05 crc kubenswrapper[4907]: I1009 20:14:05.726286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" event={"ID":"20e43860-0c38-4f47-83e6-147765347183","Type":"ContainerDied","Data":"31924b42caa40ae4e51775a21651da3a89b4a83cb05efb603714a2b110c81545"} Oct 09 20:14:05 crc kubenswrapper[4907]: I1009 20:14:05.728223 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"75ec575f-175f-4d3f-a66c-82fbd1ebc822","Type":"ContainerStarted","Data":"5c9bac2d69cfb42f2708bdb7331a674e8681d7c0a8e5721f6dad5f078d85b0ba"} Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.299719 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.299992 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.300035 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.300925 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"557db504600f3c41488c9227100b74382f080751a479b2a9db5ed710b0f070c4"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.300984 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://557db504600f3c41488c9227100b74382f080751a479b2a9db5ed710b0f070c4" gracePeriod=600 Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.741668 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="557db504600f3c41488c9227100b74382f080751a479b2a9db5ed710b0f070c4" exitCode=0 Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.741769 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"557db504600f3c41488c9227100b74382f080751a479b2a9db5ed710b0f070c4"} Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.741833 4907 scope.go:117] "RemoveContainer" containerID="257f18d2337a9e3d81a1c6f68fb450e123dd41eb436896e7c025a0a04398c9e7" Oct 09 20:14:06 crc kubenswrapper[4907]: I1009 20:14:06.873934 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="7338c5b1-5214-40d6-a82c-f1f53697a06a" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.200:8081/readyz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.102723 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.169737 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrxfg\" (UniqueName: \"kubernetes.io/projected/20e43860-0c38-4f47-83e6-147765347183-kube-api-access-wrxfg\") pod \"20e43860-0c38-4f47-83e6-147765347183\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.176694 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e43860-0c38-4f47-83e6-147765347183-kube-api-access-wrxfg" (OuterVolumeSpecName: "kube-api-access-wrxfg") pod "20e43860-0c38-4f47-83e6-147765347183" (UID: "20e43860-0c38-4f47-83e6-147765347183"). InnerVolumeSpecName "kube-api-access-wrxfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.271189 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-bundle\") pod \"20e43860-0c38-4f47-83e6-147765347183\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.271384 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-util\") pod \"20e43860-0c38-4f47-83e6-147765347183\" (UID: \"20e43860-0c38-4f47-83e6-147765347183\") " Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.271822 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrxfg\" (UniqueName: \"kubernetes.io/projected/20e43860-0c38-4f47-83e6-147765347183-kube-api-access-wrxfg\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.271994 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-bundle" (OuterVolumeSpecName: "bundle") pod "20e43860-0c38-4f47-83e6-147765347183" (UID: "20e43860-0c38-4f47-83e6-147765347183"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.287865 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-util" (OuterVolumeSpecName: "util") pod "20e43860-0c38-4f47-83e6-147765347183" (UID: "20e43860-0c38-4f47-83e6-147765347183"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.374109 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-util\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.374362 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e43860-0c38-4f47-83e6-147765347183-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.752190 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" event={"ID":"20e43860-0c38-4f47-83e6-147765347183","Type":"ContainerDied","Data":"79727ae7ad74d2d509400f0ab887ef0d6a7ef32c33e5af2437866f0dd3766585"} Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.752239 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79727ae7ad74d2d509400f0ab887ef0d6a7ef32c33e5af2437866f0dd3766585" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.752314 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc" Oct 09 20:14:07 crc kubenswrapper[4907]: I1009 20:14:07.756184 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c"} Oct 09 20:14:08 crc kubenswrapper[4907]: I1009 20:14:08.693359 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.210:3000/\": dial tcp 10.217.0.210:3000: connect: connection refused" Oct 09 20:14:08 crc kubenswrapper[4907]: I1009 20:14:08.766379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"75ec575f-175f-4d3f-a66c-82fbd1ebc822","Type":"ContainerStarted","Data":"afb015b81b06aa044608d50b4a590e1218cdd98cd7900c79458e19de72810cb4"} Oct 09 20:14:08 crc kubenswrapper[4907]: I1009 20:14:08.784559 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.585047131 podStartE2EDuration="7.784537558s" podCreationTimestamp="2025-10-09 20:14:01 +0000 UTC" firstStartedPulling="2025-10-09 20:14:05.322826788 +0000 UTC m=+2730.854794277" lastFinishedPulling="2025-10-09 20:14:08.522317215 +0000 UTC m=+2734.054284704" observedRunningTime="2025-10-09 20:14:08.777445711 +0000 UTC m=+2734.309413220" watchObservedRunningTime="2025-10-09 20:14:08.784537558 +0000 UTC m=+2734.316505047" Oct 09 20:14:09 crc kubenswrapper[4907]: I1009 20:14:09.823700 4907 generic.go:334] "Generic (PLEG): container finished" podID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerID="79761f4f7288208cdb015650c604aff38ff244c4833d0aa6aaef8971055ac638" exitCode=0 Oct 09 20:14:09 crc kubenswrapper[4907]: I1009 
20:14:09.824187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerDied","Data":"79761f4f7288208cdb015650c604aff38ff244c4833d0aa6aaef8971055ac638"} Oct 09 20:14:09 crc kubenswrapper[4907]: I1009 20:14:09.993748 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.126617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-config-data\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.126985 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-scripts\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.127038 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-run-httpd\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.127062 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bzjf\" (UniqueName: \"kubernetes.io/projected/93a1e245-baac-44c9-ba36-46e2af13f3ea-kube-api-access-9bzjf\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.127078 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-sg-core-conf-yaml\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.127104 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-ceilometer-tls-certs\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.127127 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-combined-ca-bundle\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.127203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-log-httpd\") pod \"93a1e245-baac-44c9-ba36-46e2af13f3ea\" (UID: \"93a1e245-baac-44c9-ba36-46e2af13f3ea\") " Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.127945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.129975 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.133677 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a1e245-baac-44c9-ba36-46e2af13f3ea-kube-api-access-9bzjf" (OuterVolumeSpecName: "kube-api-access-9bzjf") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "kube-api-access-9bzjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.145096 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-scripts" (OuterVolumeSpecName: "scripts") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.155130 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.185556 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.204362 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.225246 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-config-data" (OuterVolumeSpecName: "config-data") pod "93a1e245-baac-44c9-ba36-46e2af13f3ea" (UID: "93a1e245-baac-44c9-ba36-46e2af13f3ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229615 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229645 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229655 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bzjf\" (UniqueName: \"kubernetes.io/projected/93a1e245-baac-44c9-ba36-46e2af13f3ea-kube-api-access-9bzjf\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229666 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229680 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229689 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229698 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93a1e245-baac-44c9-ba36-46e2af13f3ea-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.229706 4907 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a1e245-baac-44c9-ba36-46e2af13f3ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.836945 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93a1e245-baac-44c9-ba36-46e2af13f3ea","Type":"ContainerDied","Data":"1761bd048d74818883c24f60c9af82d4dd0098da9662192d637c12459191a82a"} Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.837022 4907 scope.go:117] "RemoveContainer" containerID="2b92bb2c6b19233b35f0cd7a26509483b6c9f9115bb5d2c4b567a12151e04c1e" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.837040 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.874655 4907 scope.go:117] "RemoveContainer" containerID="b52a33ba048abbb2ee4170dc75c120902fca79f2296ad600b7dbc6b25f6bf192" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.880197 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.892020 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.907804 4907 scope.go:117] "RemoveContainer" containerID="b59fef055a607abe26d6dd7b4e6c0f42fb27cc1dba53968467e95667208afcbb" Oct 09 20:14:10 crc kubenswrapper[4907]: I1009 20:14:10.944107 4907 scope.go:117] "RemoveContainer" containerID="79761f4f7288208cdb015650c604aff38ff244c4833d0aa6aaef8971055ac638" Oct 09 20:14:11 crc kubenswrapper[4907]: I1009 20:14:11.162141 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" path="/var/lib/kubelet/pods/93a1e245-baac-44c9-ba36-46e2af13f3ea/volumes" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.478514 4907 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.479388 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="0f97d607-4cf4-4c31-85eb-462554b18b34" containerName="openstackclient" containerID="cri-o://377ede7bbce4a592098c5a202b7c32a4c9639ca9717fcb6cf82c131db21ce35f" gracePeriod=2 Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.489099 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.530817 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 20:14:14.531154 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e43860-0c38-4f47-83e6-147765347183" containerName="util" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531168 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e43860-0c38-4f47-83e6-147765347183" containerName="util" Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 20:14:14.531186 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-central-agent" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531194 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-central-agent" Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 20:14:14.531208 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-notification-agent" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531214 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-notification-agent" Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 
20:14:14.531228 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e43860-0c38-4f47-83e6-147765347183" containerName="extract" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531234 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e43860-0c38-4f47-83e6-147765347183" containerName="extract" Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 20:14:14.531251 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="proxy-httpd" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531257 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="proxy-httpd" Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 20:14:14.531270 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f97d607-4cf4-4c31-85eb-462554b18b34" containerName="openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531276 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f97d607-4cf4-4c31-85eb-462554b18b34" containerName="openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 20:14:14.531291 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e43860-0c38-4f47-83e6-147765347183" containerName="pull" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531297 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e43860-0c38-4f47-83e6-147765347183" containerName="pull" Oct 09 20:14:14 crc kubenswrapper[4907]: E1009 20:14:14.531312 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="sg-core" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531319 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="sg-core" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531530 4907 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="sg-core" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531546 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="proxy-httpd" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531554 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e43860-0c38-4f47-83e6-147765347183" containerName="extract" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531562 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-notification-agent" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531578 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a1e245-baac-44c9-ba36-46e2af13f3ea" containerName="ceilometer-central-agent" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.531590 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f97d607-4cf4-4c31-85eb-462554b18b34" containerName="openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.532164 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.551759 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0f97d607-4cf4-4c31-85eb-462554b18b34" podUID="cd3f8f7d-d0f5-4719-a490-d823cf3c8b23" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.573668 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.627070 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-openstack-config\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.627422 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mpls\" (UniqueName: \"kubernetes.io/projected/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-kube-api-access-8mpls\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.627727 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-openstack-config-secret\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.627859 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.730193 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-openstack-config\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.731779 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mpls\" (UniqueName: \"kubernetes.io/projected/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-kube-api-access-8mpls\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.732339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-openstack-config-secret\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.733671 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.731690 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-openstack-config\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.747127 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-openstack-config-secret\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.747636 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.774077 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mpls\" (UniqueName: \"kubernetes.io/projected/cd3f8f7d-d0f5-4719-a490-d823cf3c8b23-kube-api-access-8mpls\") pod \"openstackclient\" (UID: \"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23\") " pod="openstack/openstackclient" Oct 09 20:14:14 crc kubenswrapper[4907]: I1009 20:14:14.855385 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.077347 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf"] Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.100982 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.104573 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.249606 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsl2x\" (UniqueName: \"kubernetes.io/projected/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-kube-api-access-zsl2x\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.250227 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.252665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.255791 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf"] Oct 09 20:14:15 crc kubenswrapper[4907]: 
I1009 20:14:15.358617 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsl2x\" (UniqueName: \"kubernetes.io/projected/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-kube-api-access-zsl2x\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.358707 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.358751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.359168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-util\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.360825 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-bundle\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.418491 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsl2x\" (UniqueName: \"kubernetes.io/projected/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-kube-api-access-zsl2x\") pod \"03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.462976 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.719555 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.721671 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.732726 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-q22dg" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.732910 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.733012 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.741223 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.758216 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.774516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.774566 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.774591 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kj88\" (UniqueName: 
\"kubernetes.io/projected/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-kube-api-access-8kj88\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.774752 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.774813 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.774955 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.824182 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.877426 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " 
pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.877831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.877854 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.877871 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kj88\" (UniqueName: \"kubernetes.io/projected/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-kube-api-access-8kj88\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.877904 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.877924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 
crc kubenswrapper[4907]: I1009 20:14:15.879586 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.889545 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.889828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.895198 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.898104 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.958504 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8kj88\" (UniqueName: \"kubernetes.io/projected/2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee-kube-api-access-8kj88\") pod \"alertmanager-metric-storage-0\" (UID: \"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee\") " pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:15 crc kubenswrapper[4907]: I1009 20:14:15.992684 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23","Type":"ContainerStarted","Data":"99d3f16cd505c8962a67f06d8895911900053c43ced21ab4d369ae8865fd208c"} Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.118940 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.223141 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.225775 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.230824 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.231041 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.231226 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5q4c6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.231391 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.231406 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.231661 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.247860 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.296219 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89bdfac5-05c6-427c-bf5e-786017f9dd26-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.296291 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.296498 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89bdfac5-05c6-427c-bf5e-786017f9dd26-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.296857 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-config\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.296885 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l8wk\" (UniqueName: \"kubernetes.io/projected/89bdfac5-05c6-427c-bf5e-786017f9dd26-kube-api-access-9l8wk\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.296948 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.296982 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89bdfac5-05c6-427c-bf5e-786017f9dd26-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.297088 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.405907 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-config\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.405968 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l8wk\" (UniqueName: \"kubernetes.io/projected/89bdfac5-05c6-427c-bf5e-786017f9dd26-kube-api-access-9l8wk\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.406033 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.406070 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89bdfac5-05c6-427c-bf5e-786017f9dd26-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.406195 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.406294 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89bdfac5-05c6-427c-bf5e-786017f9dd26-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.406352 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.406377 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89bdfac5-05c6-427c-bf5e-786017f9dd26-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.407294 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89bdfac5-05c6-427c-bf5e-786017f9dd26-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.425272 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89bdfac5-05c6-427c-bf5e-786017f9dd26-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.426752 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-config\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.438254 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf"] Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.444043 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l8wk\" (UniqueName: \"kubernetes.io/projected/89bdfac5-05c6-427c-bf5e-786017f9dd26-kube-api-access-9l8wk\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.447331 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.447453 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89bdfac5-05c6-427c-bf5e-786017f9dd26-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.449302 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89bdfac5-05c6-427c-bf5e-786017f9dd26-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.450886 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.450912 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1a95f39f2431785c6361af32a1e2f97776daa8a458e3617a29a4f7a2a65c1545/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.552611 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6"] Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.554633 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.561185 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.561549 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.561680 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.561789 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.561977 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.562098 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-jw9hv" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.596114 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6"] Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.721280 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-apiservice-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 
20:14:16.722346 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.722407 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-webhook-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.722446 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g665n\" (UniqueName: \"kubernetes.io/projected/584227b0-c217-4ec6-81fc-195bf4da68f3-kube-api-access-g665n\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.722573 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/584227b0-c217-4ec6-81fc-195bf4da68f3-manager-config\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.723828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d1b87ed6-6911-4cb5-8906-90e9e8b24c70\") pod \"prometheus-metric-storage-0\" (UID: \"89bdfac5-05c6-427c-bf5e-786017f9dd26\") " pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.829866 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-apiservice-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.830124 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.830166 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-webhook-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.830193 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g665n\" (UniqueName: \"kubernetes.io/projected/584227b0-c217-4ec6-81fc-195bf4da68f3-kube-api-access-g665n\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: 
\"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.830225 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/584227b0-c217-4ec6-81fc-195bf4da68f3-manager-config\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.831020 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/584227b0-c217-4ec6-81fc-195bf4da68f3-manager-config\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.839388 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.852337 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-apiservice-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.878136 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/584227b0-c217-4ec6-81fc-195bf4da68f3-webhook-cert\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.878153 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g665n\" (UniqueName: \"kubernetes.io/projected/584227b0-c217-4ec6-81fc-195bf4da68f3-kube-api-access-g665n\") pod \"loki-operator-controller-manager-6978c7c7cf-jnfh6\" (UID: \"584227b0-c217-4ec6-81fc-195bf4da68f3\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.886774 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 09 20:14:16 crc kubenswrapper[4907]: W1009 20:14:16.890618 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c2c1206_5a3b_4d9f_954f_a42d6c6ef0ee.slice/crio-9dfdc73816fc0d4076baec55c76cb7325e5ff0b3f02655daae428ab643319e46 WatchSource:0}: Error finding container 9dfdc73816fc0d4076baec55c76cb7325e5ff0b3f02655daae428ab643319e46: Status 404 returned error can't find the container with id 9dfdc73816fc0d4076baec55c76cb7325e5ff0b3f02655daae428ab643319e46 Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.936417 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:16 crc kubenswrapper[4907]: I1009 20:14:16.966341 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.089760 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee","Type":"ContainerStarted","Data":"9dfdc73816fc0d4076baec55c76cb7325e5ff0b3f02655daae428ab643319e46"} Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.094097 4907 generic.go:334] "Generic (PLEG): container finished" podID="0f97d607-4cf4-4c31-85eb-462554b18b34" containerID="377ede7bbce4a592098c5a202b7c32a4c9639ca9717fcb6cf82c131db21ce35f" exitCode=137 Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.098909 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" event={"ID":"7b681ab0-8c28-47ed-9f9c-77f233a4ad91","Type":"ContainerDied","Data":"2d12d3a8b8638669a4d3a0d85ba6d99efc322e75a79c9580329fc10dd1f94bf7"} Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.098877 4907 generic.go:334] "Generic (PLEG): container finished" podID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerID="2d12d3a8b8638669a4d3a0d85ba6d99efc322e75a79c9580329fc10dd1f94bf7" exitCode=0 Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.099118 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" event={"ID":"7b681ab0-8c28-47ed-9f9c-77f233a4ad91","Type":"ContainerStarted","Data":"324072ee32b75522285285a0699ad44fbf248092b5fc508c4d7638b51630b92f"} Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.114198 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cd3f8f7d-d0f5-4719-a490-d823cf3c8b23","Type":"ContainerStarted","Data":"54bf5f78c497750c75e0bf2a435f0d58c27f192bbe4e2baa1a2651057d95f198"} Oct 09 20:14:17 crc 
kubenswrapper[4907]: I1009 20:14:17.150999 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.15098443 podStartE2EDuration="3.15098443s" podCreationTimestamp="2025-10-09 20:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 20:14:17.147877733 +0000 UTC m=+2742.679845232" watchObservedRunningTime="2025-10-09 20:14:17.15098443 +0000 UTC m=+2742.682951919" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.175654 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.179447 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0f97d607-4cf4-4c31-85eb-462554b18b34" podUID="cd3f8f7d-d0f5-4719-a490-d823cf3c8b23" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.357884 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config\") pod \"0f97d607-4cf4-4c31-85eb-462554b18b34\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.358305 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config-secret\") pod \"0f97d607-4cf4-4c31-85eb-462554b18b34\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.358492 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-combined-ca-bundle\") pod 
\"0f97d607-4cf4-4c31-85eb-462554b18b34\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.358574 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lr7d\" (UniqueName: \"kubernetes.io/projected/0f97d607-4cf4-4c31-85eb-462554b18b34-kube-api-access-4lr7d\") pod \"0f97d607-4cf4-4c31-85eb-462554b18b34\" (UID: \"0f97d607-4cf4-4c31-85eb-462554b18b34\") " Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.365746 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f97d607-4cf4-4c31-85eb-462554b18b34-kube-api-access-4lr7d" (OuterVolumeSpecName: "kube-api-access-4lr7d") pod "0f97d607-4cf4-4c31-85eb-462554b18b34" (UID: "0f97d607-4cf4-4c31-85eb-462554b18b34"). InnerVolumeSpecName "kube-api-access-4lr7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.400938 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0f97d607-4cf4-4c31-85eb-462554b18b34" (UID: "0f97d607-4cf4-4c31-85eb-462554b18b34"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.405119 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f97d607-4cf4-4c31-85eb-462554b18b34" (UID: "0f97d607-4cf4-4c31-85eb-462554b18b34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.431342 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0f97d607-4cf4-4c31-85eb-462554b18b34" (UID: "0f97d607-4cf4-4c31-85eb-462554b18b34"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.461532 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lr7d\" (UniqueName: \"kubernetes.io/projected/0f97d607-4cf4-4c31-85eb-462554b18b34-kube-api-access-4lr7d\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.461570 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.461580 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.461589 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f97d607-4cf4-4c31-85eb-462554b18b34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.684433 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6"] Oct 09 20:14:17 crc kubenswrapper[4907]: I1009 20:14:17.843521 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 09 
20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.124545 4907 scope.go:117] "RemoveContainer" containerID="377ede7bbce4a592098c5a202b7c32a4c9639ca9717fcb6cf82c131db21ce35f" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.124640 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.128982 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89bdfac5-05c6-427c-bf5e-786017f9dd26","Type":"ContainerStarted","Data":"d8014dfb4072d7bdc78de8a37fe54a6087eb2676c9fb838d3f037892f8f5ec2e"} Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.130329 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0f97d607-4cf4-4c31-85eb-462554b18b34" podUID="cd3f8f7d-d0f5-4719-a490-d823cf3c8b23" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.132321 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" event={"ID":"584227b0-c217-4ec6-81fc-195bf4da68f3","Type":"ContainerStarted","Data":"10d14267ef588fd017027ca5c3cd5ccf0730fb40724abd9958a0626000c14fef"} Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.152845 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0f97d607-4cf4-4c31-85eb-462554b18b34" podUID="cd3f8f7d-d0f5-4719-a490-d823cf3c8b23" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.586578 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.590299 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.592942 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-sv5rk" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.593292 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.593298 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.593457 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.692895 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-log-httpd\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.693087 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.693133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-config-data\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.693159 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-scripts\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.693221 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-run-httpd\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.693285 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlvm\" (UniqueName: \"kubernetes.io/projected/f94f166b-aee9-436b-9ced-297ca8cdc96a-kube-api-access-vjlvm\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.794747 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.794803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-config-data\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.794830 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-scripts\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 
20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.794867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-run-httpd\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.794910 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlvm\" (UniqueName: \"kubernetes.io/projected/f94f166b-aee9-436b-9ced-297ca8cdc96a-kube-api-access-vjlvm\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.794938 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-log-httpd\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.795348 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-log-httpd\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.795867 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-run-httpd\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.801156 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.801551 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-config-data\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.803396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-scripts\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.819816 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlvm\" (UniqueName: \"kubernetes.io/projected/f94f166b-aee9-436b-9ced-297ca8cdc96a-kube-api-access-vjlvm\") pod \"ceilometer-0\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " pod="openstack/ceilometer-0" Oct 09 20:14:18 crc kubenswrapper[4907]: I1009 20:14:18.963614 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:14:19 crc kubenswrapper[4907]: I1009 20:14:19.144518 4907 generic.go:334] "Generic (PLEG): container finished" podID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerID="55ce2628ce760bf1a0322ebee789c7c78f6a136ddd85a1b691560f21ca6df1cb" exitCode=0 Oct 09 20:14:19 crc kubenswrapper[4907]: I1009 20:14:19.144592 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" event={"ID":"7b681ab0-8c28-47ed-9f9c-77f233a4ad91","Type":"ContainerDied","Data":"55ce2628ce760bf1a0322ebee789c7c78f6a136ddd85a1b691560f21ca6df1cb"} Oct 09 20:14:19 crc kubenswrapper[4907]: I1009 20:14:19.166321 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f97d607-4cf4-4c31-85eb-462554b18b34" path="/var/lib/kubelet/pods/0f97d607-4cf4-4c31-85eb-462554b18b34/volumes" Oct 09 20:14:19 crc kubenswrapper[4907]: I1009 20:14:19.810161 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:14:19 crc kubenswrapper[4907]: W1009 20:14:19.815199 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf94f166b_aee9_436b_9ced_297ca8cdc96a.slice/crio-cf9106594443b9b6d5770677cc2d5ff9eb010373420d17c1f66bf52228a42e07 WatchSource:0}: Error finding container cf9106594443b9b6d5770677cc2d5ff9eb010373420d17c1f66bf52228a42e07: Status 404 returned error can't find the container with id cf9106594443b9b6d5770677cc2d5ff9eb010373420d17c1f66bf52228a42e07 Oct 09 20:14:20 crc kubenswrapper[4907]: I1009 20:14:20.162147 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerStarted","Data":"cf9106594443b9b6d5770677cc2d5ff9eb010373420d17c1f66bf52228a42e07"} Oct 09 20:14:20 crc kubenswrapper[4907]: I1009 20:14:20.166020 4907 generic.go:334] "Generic 
(PLEG): container finished" podID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerID="2cefe480d2e7e6ed768f1f90c24f0b05871c71af592db35225dcc51b8aed3277" exitCode=0 Oct 09 20:14:20 crc kubenswrapper[4907]: I1009 20:14:20.166074 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" event={"ID":"7b681ab0-8c28-47ed-9f9c-77f233a4ad91","Type":"ContainerDied","Data":"2cefe480d2e7e6ed768f1f90c24f0b05871c71af592db35225dcc51b8aed3277"} Oct 09 20:14:21 crc kubenswrapper[4907]: I1009 20:14:21.175848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerStarted","Data":"f48d9e50a44d366055ca54facb6090a89759038d0919ff38c294907c7a9a275c"} Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.191577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" event={"ID":"7b681ab0-8c28-47ed-9f9c-77f233a4ad91","Type":"ContainerDied","Data":"324072ee32b75522285285a0699ad44fbf248092b5fc508c4d7638b51630b92f"} Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.191619 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="324072ee32b75522285285a0699ad44fbf248092b5fc508c4d7638b51630b92f" Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.259041 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.386449 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-bundle\") pod \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.386582 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsl2x\" (UniqueName: \"kubernetes.io/projected/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-kube-api-access-zsl2x\") pod \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.386628 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-util\") pod \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\" (UID: \"7b681ab0-8c28-47ed-9f9c-77f233a4ad91\") " Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.387734 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-bundle" (OuterVolumeSpecName: "bundle") pod "7b681ab0-8c28-47ed-9f9c-77f233a4ad91" (UID: "7b681ab0-8c28-47ed-9f9c-77f233a4ad91"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.399752 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-kube-api-access-zsl2x" (OuterVolumeSpecName: "kube-api-access-zsl2x") pod "7b681ab0-8c28-47ed-9f9c-77f233a4ad91" (UID: "7b681ab0-8c28-47ed-9f9c-77f233a4ad91"). InnerVolumeSpecName "kube-api-access-zsl2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.489205 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.489247 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsl2x\" (UniqueName: \"kubernetes.io/projected/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-kube-api-access-zsl2x\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.510662 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-util" (OuterVolumeSpecName: "util") pod "7b681ab0-8c28-47ed-9f9c-77f233a4ad91" (UID: "7b681ab0-8c28-47ed-9f9c-77f233a4ad91"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:14:22 crc kubenswrapper[4907]: I1009 20:14:22.590721 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b681ab0-8c28-47ed-9f9c-77f233a4ad91-util\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:23 crc kubenswrapper[4907]: I1009 20:14:23.203511 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" event={"ID":"584227b0-c217-4ec6-81fc-195bf4da68f3","Type":"ContainerStarted","Data":"f3f0d0a158a8843e8105f0e0bdfec630c2bbc697d0b56fbed62e7478402c10ef"} Oct 09 20:14:23 crc kubenswrapper[4907]: I1009 20:14:23.205616 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee","Type":"ContainerStarted","Data":"9c8e0c62ca7539557059eaad9639272866c8446d74ef9fff1e0211c9765313fe"} Oct 09 20:14:23 crc kubenswrapper[4907]: I1009 20:14:23.206652 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89bdfac5-05c6-427c-bf5e-786017f9dd26","Type":"ContainerStarted","Data":"7e5b3a0d902d76f35f9d64acedfa490a6c80b9d3f8eafc678e4c7b21a5c986cb"} Oct 09 20:14:23 crc kubenswrapper[4907]: I1009 20:14:23.211412 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf" Oct 09 20:14:23 crc kubenswrapper[4907]: I1009 20:14:23.211421 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerStarted","Data":"2211b364fea9c948032728c78597a6a3f9a058fe6967d914af4c36ac26ba0e56"} Oct 09 20:14:24 crc kubenswrapper[4907]: I1009 20:14:24.221864 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerStarted","Data":"1725b96181c6e970bfd9df43acd134327389c5592a6be148c16b2d384f29700f"} Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.095314 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6"] Oct 09 20:14:27 crc kubenswrapper[4907]: E1009 20:14:27.096371 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerName="util" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.096393 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerName="util" Oct 09 20:14:27 crc kubenswrapper[4907]: E1009 20:14:27.096428 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerName="pull" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.096436 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" 
containerName="pull" Oct 09 20:14:27 crc kubenswrapper[4907]: E1009 20:14:27.096457 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerName="extract" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.096490 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerName="extract" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.097062 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b681ab0-8c28-47ed-9f9c-77f233a4ad91" containerName="extract" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.097808 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.099941 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.099960 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.100357 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-q7stn" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.100685 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.101320 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.111680 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6"] Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.205405 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.205572 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5wj6\" (UniqueName: \"kubernetes.io/projected/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-kube-api-access-p5wj6\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.205622 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.205678 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.205785 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.307206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5wj6\" (UniqueName: \"kubernetes.io/projected/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-kube-api-access-p5wj6\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.307285 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.307337 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.307442 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: 
\"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.307532 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.308822 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.309790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-config\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.313615 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"] Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.317489 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: 
\"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.318563 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.321547 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.332327 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5wj6\" (UniqueName: \"kubernetes.io/projected/be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3-kube-api-access-p5wj6\") pod \"cloudkitty-lokistack-distributor-56cd74f89f-gl9k6\" (UID: \"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3\") " pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.344505 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.344545 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.344740 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-loki-s3" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.361612 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"] Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.436408 4907 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.450255 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"] Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.457137 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.462958 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.463151 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.480690 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"] Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.510593 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-config\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.510656 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc 
kubenswrapper[4907]: I1009 20:14:27.510801 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n5vt\" (UniqueName: \"kubernetes.io/projected/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-kube-api-access-2n5vt\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.510864 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-logging-loki-s3\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.510957 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.510997 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.557500 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"] Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.560061 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.574508 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.574711 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.574852 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.574999 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.575151 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.577578 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.615146 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.615196 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.615226 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2dk9\" (UniqueName: \"kubernetes.io/projected/dc87ab00-6151-4b9a-828b-b7fab2987f4e-kube-api-access-s2dk9\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.615244 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.615284 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.615324 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616213 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616289 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616355 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp6dd\" (UniqueName: \"kubernetes.io/projected/11282a94-310a-44d0-8edd-8a49d8050096-kube-api-access-gp6dd\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616574 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616602 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11282a94-310a-44d0-8edd-8a49d8050096-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616669 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616739 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-config\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616789 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616879 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n5vt\" (UniqueName: \"kubernetes.io/projected/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-kube-api-access-2n5vt\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616898 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.616958 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-logging-loki-s3\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.617001 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.617024 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.617052 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.617080 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.617869 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-config\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.621592 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-logging-loki-s3\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.624896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.627416 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.636637 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"]
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.636884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n5vt\" (UniqueName: \"kubernetes.io/projected/8591c06c-4ad0-41b1-b62f-ea21f97f50a4-kube-api-access-2n5vt\") pod \"cloudkitty-lokistack-querier-68bbd7984c-gtcgb\" (UID: \"8591c06c-4ad0-41b1-b62f-ea21f97f50a4\") " pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.645143 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"]
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.646681 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.650836 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-nt24k"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.679516 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"]
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.717962 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718240 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718281 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718311 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp6dd\" (UniqueName: \"kubernetes.io/projected/11282a94-310a-44d0-8edd-8a49d8050096-kube-api-access-gp6dd\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718337 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718415 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718486 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11282a94-310a-44d0-8edd-8a49d8050096-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718535 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718566 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.718636 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.722053 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: E1009 20:14:27.725854 4907 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found
Oct 09 20:14:27 crc kubenswrapper[4907]: E1009 20:14:27.725913 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tls-secret podName:dc87ab00-6151-4b9a-828b-b7fab2987f4e nodeName:}" failed. No retries permitted until 2025-10-09 20:14:28.225899104 +0000 UTC m=+2753.757866593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tls-secret") pod "cloudkitty-lokistack-gateway-76cc998948-n4njd" (UID: "dc87ab00-6151-4b9a-828b-b7fab2987f4e") : secret "cloudkitty-lokistack-gateway-http" not found
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726078 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj4jn\" (UniqueName: \"kubernetes.io/projected/d03409bd-dfae-4397-bd24-55c925ce4d25-kube-api-access-mj4jn\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726799 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11282a94-310a-44d0-8edd-8a49d8050096-config\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726889 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726910 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726930 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726917 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.726950 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727002 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727036 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727153 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727185 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727205 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dk9\" (UniqueName: \"kubernetes.io/projected/dc87ab00-6151-4b9a-828b-b7fab2987f4e-kube-api-access-s2dk9\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727224 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727258 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.727282 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.728120 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.728252 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.729111 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.732086 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/11282a94-310a-44d0-8edd-8a49d8050096-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.734476 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.735906 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.762375 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.767278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp6dd\" (UniqueName: \"kubernetes.io/projected/11282a94-310a-44d0-8edd-8a49d8050096-kube-api-access-gp6dd\") pod \"cloudkitty-lokistack-query-frontend-779849886d-s66fd\" (UID: \"11282a94-310a-44d0-8edd-8a49d8050096\") " pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.776601 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dk9\" (UniqueName: \"kubernetes.io/projected/dc87ab00-6151-4b9a-828b-b7fab2987f4e-kube-api-access-s2dk9\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.821381 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.830639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.830776 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.830861 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj4jn\" (UniqueName: \"kubernetes.io/projected/d03409bd-dfae-4397-bd24-55c925ce4d25-kube-api-access-mj4jn\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.830906 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.831005 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.831040 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.831073 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.831108 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.831135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.832404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.833835 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: E1009 20:14:27.833915 4907 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found
Oct 09 20:14:27 crc kubenswrapper[4907]: E1009 20:14:27.833963 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tls-secret podName:d03409bd-dfae-4397-bd24-55c925ce4d25 nodeName:}" failed. No retries permitted until 2025-10-09 20:14:28.33394761 +0000 UTC m=+2753.865915099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tls-secret") pod "cloudkitty-lokistack-gateway-76cc998948-d5l99" (UID: "d03409bd-dfae-4397-bd24-55c925ce4d25") : secret "cloudkitty-lokistack-gateway-http" not found
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.835199 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.836863 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.837110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-rbac\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.840092 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tenants\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.842801 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d03409bd-dfae-4397-bd24-55c925ce4d25-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:27 crc kubenswrapper[4907]: I1009 20:14:27.853606 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj4jn\" (UniqueName: \"kubernetes.io/projected/d03409bd-dfae-4397-bd24-55c925ce4d25-kube-api-access-mj4jn\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.243740 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.252729 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/dc87ab00-6151-4b9a-828b-b7fab2987f4e-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-n4njd\" (UID: \"dc87ab00-6151-4b9a-828b-b7fab2987f4e\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.294538 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"]
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.301054 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.303705 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.309415 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.345784 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.353058 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"]
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.366127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d03409bd-dfae-4397-bd24-55c925ce4d25-tls-secret\") pod \"cloudkitty-lokistack-gateway-76cc998948-d5l99\" (UID: \"d03409bd-dfae-4397-bd24-55c925ce4d25\") " pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.396966 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.428717 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"]
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.429973 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.438993 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.439224 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.457841 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.457914 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnctr\" (UniqueName: \"kubernetes.io/projected/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-kube-api-access-bnctr\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.458003 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.458047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.458065 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-logging-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.458081 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.458114 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.458137 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 
crc kubenswrapper[4907]: I1009 20:14:28.495035 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.504545 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559642 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559704 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559728 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559796 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-logging-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559813 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559841 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-logging-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559909 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559944 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.559965 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.560000 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/246ca210-2f65-4612-a7ac-dc4e206dd6f0-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 
20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.560019 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnctr\" (UniqueName: \"kubernetes.io/projected/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-kube-api-access-bnctr\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.560038 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn4fk\" (UniqueName: \"kubernetes.io/projected/246ca210-2f65-4612-a7ac-dc4e206dd6f0-kube-api-access-xn4fk\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.560803 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.561420 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.563293 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.563339 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3d2bd6b35e195925d62bff190499de88ca094787b3872fde9bf50849affe3b08/globalmount\"" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.563360 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.563385 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66d846bd7fbb2ecc4d4cc06f8a98a4cb13e942c656247dbc91deb3303311883d/globalmount\"" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.565155 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.565515 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" 
(UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-logging-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.570548 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.571865 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.574341 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.577199 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.581605 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.581790 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.647234 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnctr\" (UniqueName: \"kubernetes.io/projected/bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0-kube-api-access-bnctr\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: 
I1009 20:14:28.648063 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2f4970aa-1f30-4a27-91c6-66b734fb98ce\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.649039 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d9a557d-f8b4-4941-bcc3-2c9474b2078f\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0\") " pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664543 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1d45fa-edcc-4ab0-a435-26fce79f5607-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664625 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-logging-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664681 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664745 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-logging-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664804 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664860 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vfh\" (UniqueName: \"kubernetes.io/projected/2f1d45fa-edcc-4ab0-a435-26fce79f5607-kube-api-access-75vfh\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664885 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/246ca210-2f65-4612-a7ac-dc4e206dd6f0-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.664964 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.665003 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn4fk\" (UniqueName: \"kubernetes.io/projected/246ca210-2f65-4612-a7ac-dc4e206dd6f0-kube-api-access-xn4fk\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.665052 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.665083 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-134ffba7-fae2-4c16-8182-12af9e235404\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134ffba7-fae2-4c16-8182-12af9e235404\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " 
pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.665151 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.665216 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.666454 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.670801 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/246ca210-2f65-4612-a7ac-dc4e206dd6f0-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.674310 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-logging-loki-s3\") pod 
\"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.681491 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.689418 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/246ca210-2f65-4612-a7ac-dc4e206dd6f0-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.699515 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.699565 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e69fbd2064ca7756d4eab6999e090d7c15e144226022353b6a9f4eab34fd2b5c/globalmount\"" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.700551 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn4fk\" (UniqueName: \"kubernetes.io/projected/246ca210-2f65-4612-a7ac-dc4e206dd6f0-kube-api-access-xn4fk\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.717439 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.750011 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7afc8b1a-d8d8-44af-a252-c5164d4c2113\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"246ca210-2f65-4612-a7ac-dc4e206dd6f0\") " pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.767041 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1d45fa-edcc-4ab0-a435-26fce79f5607-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.767114 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-logging-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.767158 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75vfh\" (UniqueName: \"kubernetes.io/projected/2f1d45fa-edcc-4ab0-a435-26fce79f5607-kube-api-access-75vfh\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.767199 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: 
\"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.767224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.767279 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.767305 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-134ffba7-fae2-4c16-8182-12af9e235404\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134ffba7-fae2-4c16-8182-12af9e235404\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.768815 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1d45fa-edcc-4ab0-a435-26fce79f5607-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc 
kubenswrapper[4907]: I1009 20:14:28.768921 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.771797 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-logging-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.772238 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.772775 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.772802 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-134ffba7-fae2-4c16-8182-12af9e235404\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134ffba7-fae2-4c16-8182-12af9e235404\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/52daaa2af1fcd5f2fa7ddf27369dd13e3a350d7b4d40903c91b783f88c80f061/globalmount\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.778376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/2f1d45fa-edcc-4ab0-a435-26fce79f5607-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.779080 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.782721 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vfh\" (UniqueName: \"kubernetes.io/projected/2f1d45fa-edcc-4ab0-a435-26fce79f5607-kube-api-access-75vfh\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:28 crc kubenswrapper[4907]: I1009 20:14:28.819759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-134ffba7-fae2-4c16-8182-12af9e235404\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-134ffba7-fae2-4c16-8182-12af9e235404\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"2f1d45fa-edcc-4ab0-a435-26fce79f5607\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:29 crc kubenswrapper[4907]: I1009 20:14:29.033154 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:29 crc kubenswrapper[4907]: I1009 20:14:29.339498 4907 generic.go:334] "Generic (PLEG): container finished" podID="89bdfac5-05c6-427c-bf5e-786017f9dd26" containerID="7e5b3a0d902d76f35f9d64acedfa490a6c80b9d3f8eafc678e4c7b21a5c986cb" exitCode=0 Oct 09 20:14:29 crc kubenswrapper[4907]: I1009 20:14:29.339554 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89bdfac5-05c6-427c-bf5e-786017f9dd26","Type":"ContainerDied","Data":"7e5b3a0d902d76f35f9d64acedfa490a6c80b9d3f8eafc678e4c7b21a5c986cb"} Oct 09 20:14:29 crc kubenswrapper[4907]: I1009 20:14:29.354849 4907 generic.go:334] "Generic (PLEG): container finished" podID="2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee" containerID="9c8e0c62ca7539557059eaad9639272866c8446d74ef9fff1e0211c9765313fe" exitCode=0 Oct 09 20:14:29 crc kubenswrapper[4907]: I1009 20:14:29.354889 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee","Type":"ContainerDied","Data":"9c8e0c62ca7539557059eaad9639272866c8446d74ef9fff1e0211c9765313fe"} Oct 09 20:14:34 crc kubenswrapper[4907]: I1009 20:14:34.690599 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Oct 09 20:14:34 crc kubenswrapper[4907]: I1009 20:14:34.847947 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6"] Oct 09 20:14:36 crc kubenswrapper[4907]: I1009 20:14:36.475890 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"2f1d45fa-edcc-4ab0-a435-26fce79f5607","Type":"ContainerStarted","Data":"6c879fe4421b76c2d3f2f45e9da55ccb41d3a6b5c3c1ecea1aeecf2f0305b641"} Oct 09 20:14:36 crc kubenswrapper[4907]: I1009 20:14:36.477772 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" event={"ID":"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3","Type":"ContainerStarted","Data":"d307892bdf4be69f967c86c1fec5bbad130d5ebf666e67a9f00de0c93377ee28"} Oct 09 20:14:36 crc kubenswrapper[4907]: I1009 20:14:36.665300 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd"] Oct 09 20:14:36 crc kubenswrapper[4907]: I1009 20:14:36.674604 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd"] Oct 09 20:14:36 crc kubenswrapper[4907]: I1009 20:14:36.797216 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Oct 09 20:14:36 crc kubenswrapper[4907]: I1009 20:14:36.810873 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Oct 09 20:14:36 crc kubenswrapper[4907]: I1009 20:14:36.947611 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99"] Oct 09 20:14:38 crc kubenswrapper[4907]: W1009 20:14:38.654780 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc87ab00_6151_4b9a_828b_b7fab2987f4e.slice/crio-6b6a4b75fc02fc350e60f7c0971b80d658ac5cd5278b45d73a3d417ff259adb2 WatchSource:0}: Error finding container 6b6a4b75fc02fc350e60f7c0971b80d658ac5cd5278b45d73a3d417ff259adb2: Status 404 returned error can't find the container with id 6b6a4b75fc02fc350e60f7c0971b80d658ac5cd5278b45d73a3d417ff259adb2 Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.142561 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb"] Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.504387 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99" event={"ID":"d03409bd-dfae-4397-bd24-55c925ce4d25","Type":"ContainerStarted","Data":"1031ff77be3bb9be704694dd8d532bc213c7d7a255da1c0639c2b939cb62e94e"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.511151 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" event={"ID":"584227b0-c217-4ec6-81fc-195bf4da68f3","Type":"ContainerStarted","Data":"d6e23de83c49d5c3027243cdb5133daa0e45ba02fdb6ae0a2f2743e8bcf11d78"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.511650 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.513035 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" event={"ID":"8591c06c-4ad0-41b1-b62f-ea21f97f50a4","Type":"ContainerStarted","Data":"f8c7abd271ac92a50dbc449dfefc11eb7aa87708d027d704e4d7cdc7e80d9225"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.516746 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.518216 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee","Type":"ContainerStarted","Data":"ab2eba7d5d5527737ec48cc8e1e523184d2c7e06ec574cda7a8b7bae0325669a"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.522002 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd" event={"ID":"11282a94-310a-44d0-8edd-8a49d8050096","Type":"ContainerStarted","Data":"5231ae0808a1310904c827e44eba8ab870c45ab39d5ff519bef1122e4cc1efac"} Oct 09 
20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.525320 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0","Type":"ContainerStarted","Data":"23371cb35d4a3925ce2235ae5a0b18032b57722fd754fb759b53e0eb845eebf1"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.528144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" event={"ID":"dc87ab00-6151-4b9a-828b-b7fab2987f4e","Type":"ContainerStarted","Data":"6b6a4b75fc02fc350e60f7c0971b80d658ac5cd5278b45d73a3d417ff259adb2"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.532495 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89bdfac5-05c6-427c-bf5e-786017f9dd26","Type":"ContainerStarted","Data":"f21573ad066a419d5eccdcb9eaf4fd73883e5d18f1e746c74f19ae55735c2294"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.571187 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerStarted","Data":"d33783f2bbe5ebf346dd7b6037dd0bf920df86595c40c175ce1c769027602c9b"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.574609 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.598093 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"246ca210-2f65-4612-a7ac-dc4e206dd6f0","Type":"ContainerStarted","Data":"a5176e12055a0c6caf9a5b5d4d460b192ad0235ee42ff4c3ab80eab02a6cadb3"} Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.599052 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6978c7c7cf-jnfh6" podStartSLOduration=2.519134915 
podStartE2EDuration="23.599020279s" podCreationTimestamp="2025-10-09 20:14:16 +0000 UTC" firstStartedPulling="2025-10-09 20:14:17.712592492 +0000 UTC m=+2743.244559981" lastFinishedPulling="2025-10-09 20:14:38.792477856 +0000 UTC m=+2764.324445345" observedRunningTime="2025-10-09 20:14:39.53532873 +0000 UTC m=+2765.067296229" watchObservedRunningTime="2025-10-09 20:14:39.599020279 +0000 UTC m=+2765.130987778" Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.664445 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.122689666 podStartE2EDuration="21.664419091s" podCreationTimestamp="2025-10-09 20:14:18 +0000 UTC" firstStartedPulling="2025-10-09 20:14:19.817383498 +0000 UTC m=+2745.349350987" lastFinishedPulling="2025-10-09 20:14:34.359112923 +0000 UTC m=+2759.891080412" observedRunningTime="2025-10-09 20:14:39.633775296 +0000 UTC m=+2765.165742795" watchObservedRunningTime="2025-10-09 20:14:39.664419091 +0000 UTC m=+2765.196386600" Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.742610 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-lb67g"] Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.744037 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-lb67g" Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.752228 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-lb67g"] Oct 09 20:14:39 crc kubenswrapper[4907]: I1009 20:14:39.945803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87l9s\" (UniqueName: \"kubernetes.io/projected/e424b2b3-47b9-4089-bef8-24998fb2d49e-kube-api-access-87l9s\") pod \"cloudkitty-db-create-lb67g\" (UID: \"e424b2b3-47b9-4089-bef8-24998fb2d49e\") " pod="openstack/cloudkitty-db-create-lb67g" Oct 09 20:14:40 crc kubenswrapper[4907]: I1009 20:14:40.047846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87l9s\" (UniqueName: \"kubernetes.io/projected/e424b2b3-47b9-4089-bef8-24998fb2d49e-kube-api-access-87l9s\") pod \"cloudkitty-db-create-lb67g\" (UID: \"e424b2b3-47b9-4089-bef8-24998fb2d49e\") " pod="openstack/cloudkitty-db-create-lb67g" Oct 09 20:14:40 crc kubenswrapper[4907]: I1009 20:14:40.068760 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87l9s\" (UniqueName: \"kubernetes.io/projected/e424b2b3-47b9-4089-bef8-24998fb2d49e-kube-api-access-87l9s\") pod \"cloudkitty-db-create-lb67g\" (UID: \"e424b2b3-47b9-4089-bef8-24998fb2d49e\") " pod="openstack/cloudkitty-db-create-lb67g" Oct 09 20:14:40 crc kubenswrapper[4907]: I1009 20:14:40.367327 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-lb67g" Oct 09 20:14:41 crc kubenswrapper[4907]: W1009 20:14:41.029066 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode424b2b3_47b9_4089_bef8_24998fb2d49e.slice/crio-19afe9055256a73d67617431419def44625ce8cd551c0c2c84ff52893db7e7cd WatchSource:0}: Error finding container 19afe9055256a73d67617431419def44625ce8cd551c0c2c84ff52893db7e7cd: Status 404 returned error can't find the container with id 19afe9055256a73d67617431419def44625ce8cd551c0c2c84ff52893db7e7cd Oct 09 20:14:41 crc kubenswrapper[4907]: I1009 20:14:41.031246 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-lb67g"] Oct 09 20:14:41 crc kubenswrapper[4907]: I1009 20:14:41.621623 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-lb67g" event={"ID":"e424b2b3-47b9-4089-bef8-24998fb2d49e","Type":"ContainerStarted","Data":"19afe9055256a73d67617431419def44625ce8cd551c0c2c84ff52893db7e7cd"} Oct 09 20:14:42 crc kubenswrapper[4907]: I1009 20:14:42.635642 4907 generic.go:334] "Generic (PLEG): container finished" podID="e424b2b3-47b9-4089-bef8-24998fb2d49e" containerID="ec815aa94e48ee18c7125a8aef8d78e430f8636705aa00a71429bddf8acbe56e" exitCode=0 Oct 09 20:14:42 crc kubenswrapper[4907]: I1009 20:14:42.636106 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-lb67g" event={"ID":"e424b2b3-47b9-4089-bef8-24998fb2d49e","Type":"ContainerDied","Data":"ec815aa94e48ee18c7125a8aef8d78e430f8636705aa00a71429bddf8acbe56e"} Oct 09 20:14:42 crc kubenswrapper[4907]: I1009 20:14:42.644750 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee","Type":"ContainerStarted","Data":"f121bc4eefc9efcbce133271aec3e5c0c699ea63adc7947714e4db923bc58a6f"} Oct 09 20:14:42 crc 
kubenswrapper[4907]: I1009 20:14:42.645179 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:42 crc kubenswrapper[4907]: I1009 20:14:42.653888 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 09 20:14:42 crc kubenswrapper[4907]: I1009 20:14:42.659788 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89bdfac5-05c6-427c-bf5e-786017f9dd26","Type":"ContainerStarted","Data":"86a7690a2fdb01e7e3b98df2158a2451426c60e329679eb37d5afa3c6d3d9912"} Oct 09 20:14:42 crc kubenswrapper[4907]: I1009 20:14:42.686759 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.904040689 podStartE2EDuration="27.686740518s" podCreationTimestamp="2025-10-09 20:14:15 +0000 UTC" firstStartedPulling="2025-10-09 20:14:16.895009803 +0000 UTC m=+2742.426977292" lastFinishedPulling="2025-10-09 20:14:38.677709622 +0000 UTC m=+2764.209677121" observedRunningTime="2025-10-09 20:14:42.678758479 +0000 UTC m=+2768.210725978" watchObservedRunningTime="2025-10-09 20:14:42.686740518 +0000 UTC m=+2768.218708007" Oct 09 20:14:44 crc kubenswrapper[4907]: I1009 20:14:44.633532 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-lb67g" Oct 09 20:14:44 crc kubenswrapper[4907]: I1009 20:14:44.704700 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-lb67g" Oct 09 20:14:44 crc kubenswrapper[4907]: I1009 20:14:44.704876 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-lb67g" event={"ID":"e424b2b3-47b9-4089-bef8-24998fb2d49e","Type":"ContainerDied","Data":"19afe9055256a73d67617431419def44625ce8cd551c0c2c84ff52893db7e7cd"} Oct 09 20:14:44 crc kubenswrapper[4907]: I1009 20:14:44.704902 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19afe9055256a73d67617431419def44625ce8cd551c0c2c84ff52893db7e7cd" Oct 09 20:14:44 crc kubenswrapper[4907]: I1009 20:14:44.793517 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87l9s\" (UniqueName: \"kubernetes.io/projected/e424b2b3-47b9-4089-bef8-24998fb2d49e-kube-api-access-87l9s\") pod \"e424b2b3-47b9-4089-bef8-24998fb2d49e\" (UID: \"e424b2b3-47b9-4089-bef8-24998fb2d49e\") " Oct 09 20:14:44 crc kubenswrapper[4907]: I1009 20:14:44.818805 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e424b2b3-47b9-4089-bef8-24998fb2d49e-kube-api-access-87l9s" (OuterVolumeSpecName: "kube-api-access-87l9s") pod "e424b2b3-47b9-4089-bef8-24998fb2d49e" (UID: "e424b2b3-47b9-4089-bef8-24998fb2d49e"). InnerVolumeSpecName "kube-api-access-87l9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:14:44 crc kubenswrapper[4907]: I1009 20:14:44.896742 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87l9s\" (UniqueName: \"kubernetes.io/projected/e424b2b3-47b9-4089-bef8-24998fb2d49e-kube-api-access-87l9s\") on node \"crc\" DevicePath \"\"" Oct 09 20:14:46 crc kubenswrapper[4907]: I1009 20:14:46.730652 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99" event={"ID":"d03409bd-dfae-4397-bd24-55c925ce4d25","Type":"ContainerStarted","Data":"d05a4fd9bfa9b9dcf0969a58dd41a2be871a21879de28c619d487610cac48da6"} Oct 09 20:14:46 crc kubenswrapper[4907]: I1009 20:14:46.731008 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99" Oct 09 20:14:46 crc kubenswrapper[4907]: I1009 20:14:46.732614 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"246ca210-2f65-4612-a7ac-dc4e206dd6f0","Type":"ContainerStarted","Data":"6445807817c4e260e6cd8a486a6cd8c98503f3ff606cf1c0f08d5175aff30795"} Oct 09 20:14:46 crc kubenswrapper[4907]: I1009 20:14:46.733089 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:14:46 crc kubenswrapper[4907]: I1009 20:14:46.761596 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99" Oct 09 20:14:46 crc kubenswrapper[4907]: I1009 20:14:46.779787 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-d5l99" podStartSLOduration=12.660927974 podStartE2EDuration="19.779770839s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:38.739495604 +0000 UTC m=+2764.271463093" lastFinishedPulling="2025-10-09 
20:14:45.858338469 +0000 UTC m=+2771.390305958" observedRunningTime="2025-10-09 20:14:46.75774947 +0000 UTC m=+2772.289716969" watchObservedRunningTime="2025-10-09 20:14:46.779770839 +0000 UTC m=+2772.311738328" Oct 09 20:14:46 crc kubenswrapper[4907]: I1009 20:14:46.798037 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=12.707832005 podStartE2EDuration="19.798015904s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:38.747813062 +0000 UTC m=+2764.279780551" lastFinishedPulling="2025-10-09 20:14:45.837996961 +0000 UTC m=+2771.369964450" observedRunningTime="2025-10-09 20:14:46.789052031 +0000 UTC m=+2772.321019520" watchObservedRunningTime="2025-10-09 20:14:46.798015904 +0000 UTC m=+2772.329983393" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.753895 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" event={"ID":"be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3","Type":"ContainerStarted","Data":"d3503f76dacdec8b622fe610d92ea42011d0755c7785be0a0e625c03b5afbc95"} Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.754664 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.756792 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" event={"ID":"8591c06c-4ad0-41b1-b62f-ea21f97f50a4","Type":"ContainerStarted","Data":"96e20f971a780aa928723f18be235848a996052f80471fb132c60cb6f5d93115"} Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.756939 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.758762 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd" event={"ID":"11282a94-310a-44d0-8edd-8a49d8050096","Type":"ContainerStarted","Data":"4e0d54ac88ca0bf0f2f8ed758e8cf3b0cc2b9b26d60a3872b433041eb3441cda"} Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.758836 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.763323 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89bdfac5-05c6-427c-bf5e-786017f9dd26","Type":"ContainerStarted","Data":"fea90546aa59aaf84ae7a85ad1c22ea54a5d300ad16d06f2f5f8868fb51e4bb9"} Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.766352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0","Type":"ContainerStarted","Data":"d222a9a9d0fe35eb931a04ae7e56cd4a12caeab22b0e9121692f19f34eb76eaf"} Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.767020 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.769022 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"2f1d45fa-edcc-4ab0-a435-26fce79f5607","Type":"ContainerStarted","Data":"34010824de45fa385978c27dbc570d0c849bfb923dff95f7b0c0d91ceb9918a7"} Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.769146 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.771692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" 
event={"ID":"dc87ab00-6151-4b9a-828b-b7fab2987f4e","Type":"ContainerStarted","Data":"19b94b58adc3bce7e9ddd36ff5192423704a27d437c461dbd721d756a6031925"} Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.781212 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" podStartSLOduration=12.016560527 podStartE2EDuration="21.781192894s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:36.147061413 +0000 UTC m=+2761.679028902" lastFinishedPulling="2025-10-09 20:14:45.91169378 +0000 UTC m=+2771.443661269" observedRunningTime="2025-10-09 20:14:48.775753748 +0000 UTC m=+2774.307721257" watchObservedRunningTime="2025-10-09 20:14:48.781192894 +0000 UTC m=+2774.313160393" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.808831 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" podStartSLOduration=15.073527228 podStartE2EDuration="21.808802273s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:39.18982555 +0000 UTC m=+2764.721793039" lastFinishedPulling="2025-10-09 20:14:45.925100585 +0000 UTC m=+2771.457068084" observedRunningTime="2025-10-09 20:14:48.803954422 +0000 UTC m=+2774.335921941" watchObservedRunningTime="2025-10-09 20:14:48.808802273 +0000 UTC m=+2774.340769782" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.833487 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" podStartSLOduration=14.506882991 podStartE2EDuration="21.833447228s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:38.656882133 +0000 UTC m=+2764.188849622" lastFinishedPulling="2025-10-09 20:14:45.98344637 +0000 UTC m=+2771.515413859" observedRunningTime="2025-10-09 20:14:48.826682089 +0000 UTC 
m=+2774.358649598" watchObservedRunningTime="2025-10-09 20:14:48.833447228 +0000 UTC m=+2774.365414717" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.850164 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=14.689424345 podStartE2EDuration="21.850145495s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:38.747527394 +0000 UTC m=+2764.279494883" lastFinishedPulling="2025-10-09 20:14:45.908248524 +0000 UTC m=+2771.440216033" observedRunningTime="2025-10-09 20:14:48.847986401 +0000 UTC m=+2774.379953910" watchObservedRunningTime="2025-10-09 20:14:48.850145495 +0000 UTC m=+2774.382112984" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.884849 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd" podStartSLOduration=14.63234394 podStartE2EDuration="21.88483101s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:38.673650891 +0000 UTC m=+2764.205618380" lastFinishedPulling="2025-10-09 20:14:45.926137961 +0000 UTC m=+2771.458105450" observedRunningTime="2025-10-09 20:14:48.873048746 +0000 UTC m=+2774.405016235" watchObservedRunningTime="2025-10-09 20:14:48.88483101 +0000 UTC m=+2774.416798509" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.926263 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=12.164750454 podStartE2EDuration="21.926230003s" podCreationTimestamp="2025-10-09 20:14:27 +0000 UTC" firstStartedPulling="2025-10-09 20:14:36.147060143 +0000 UTC m=+2761.679027642" lastFinishedPulling="2025-10-09 20:14:45.908539702 +0000 UTC m=+2771.440507191" observedRunningTime="2025-10-09 20:14:48.918810408 +0000 UTC m=+2774.450777897" watchObservedRunningTime="2025-10-09 20:14:48.926230003 +0000 
UTC m=+2774.458197492" Oct 09 20:14:48 crc kubenswrapper[4907]: I1009 20:14:48.955608 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.238651887 podStartE2EDuration="33.955592326s" podCreationTimestamp="2025-10-09 20:14:15 +0000 UTC" firstStartedPulling="2025-10-09 20:14:17.857565559 +0000 UTC m=+2743.389533048" lastFinishedPulling="2025-10-09 20:14:47.574506008 +0000 UTC m=+2773.106473487" observedRunningTime="2025-10-09 20:14:48.951671338 +0000 UTC m=+2774.483638857" watchObservedRunningTime="2025-10-09 20:14:48.955592326 +0000 UTC m=+2774.487559815" Oct 09 20:14:49 crc kubenswrapper[4907]: I1009 20:14:49.779492 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" Oct 09 20:14:49 crc kubenswrapper[4907]: I1009 20:14:49.802654 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-76cc998948-n4njd" Oct 09 20:14:51 crc kubenswrapper[4907]: I1009 20:14:51.938078 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 09 20:14:59 crc kubenswrapper[4907]: I1009 20:14:59.930087 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-0e83-account-create-54jfg"] Oct 09 20:14:59 crc kubenswrapper[4907]: E1009 20:14:59.932296 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e424b2b3-47b9-4089-bef8-24998fb2d49e" containerName="mariadb-database-create" Oct 09 20:14:59 crc kubenswrapper[4907]: I1009 20:14:59.932383 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e424b2b3-47b9-4089-bef8-24998fb2d49e" containerName="mariadb-database-create" Oct 09 20:14:59 crc kubenswrapper[4907]: I1009 20:14:59.932764 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e424b2b3-47b9-4089-bef8-24998fb2d49e" 
containerName="mariadb-database-create" Oct 09 20:14:59 crc kubenswrapper[4907]: I1009 20:14:59.933942 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0e83-account-create-54jfg" Oct 09 20:14:59 crc kubenswrapper[4907]: I1009 20:14:59.948941 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-0e83-account-create-54jfg"] Oct 09 20:14:59 crc kubenswrapper[4907]: I1009 20:14:59.994692 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.052702 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8t8n\" (UniqueName: \"kubernetes.io/projected/7f8acc1c-79a2-43fc-9493-5baff3406366-kube-api-access-l8t8n\") pod \"cloudkitty-0e83-account-create-54jfg\" (UID: \"7f8acc1c-79a2-43fc-9493-5baff3406366\") " pod="openstack/cloudkitty-0e83-account-create-54jfg" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.154778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8t8n\" (UniqueName: \"kubernetes.io/projected/7f8acc1c-79a2-43fc-9493-5baff3406366-kube-api-access-l8t8n\") pod \"cloudkitty-0e83-account-create-54jfg\" (UID: \"7f8acc1c-79a2-43fc-9493-5baff3406366\") " pod="openstack/cloudkitty-0e83-account-create-54jfg" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.159541 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t"] Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.161158 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.173764 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.173984 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.178108 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t"] Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.233580 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8t8n\" (UniqueName: \"kubernetes.io/projected/7f8acc1c-79a2-43fc-9493-5baff3406366-kube-api-access-l8t8n\") pod \"cloudkitty-0e83-account-create-54jfg\" (UID: \"7f8acc1c-79a2-43fc-9493-5baff3406366\") " pod="openstack/cloudkitty-0e83-account-create-54jfg" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.256465 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9tjx\" (UniqueName: \"kubernetes.io/projected/d755a8cc-ff77-4c6f-9fde-62a156178521-kube-api-access-q9tjx\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.256732 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d755a8cc-ff77-4c6f-9fde-62a156178521-secret-volume\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" 
Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.256772 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d755a8cc-ff77-4c6f-9fde-62a156178521-config-volume\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.334168 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0e83-account-create-54jfg" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.359231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d755a8cc-ff77-4c6f-9fde-62a156178521-secret-volume\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.359298 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d755a8cc-ff77-4c6f-9fde-62a156178521-config-volume\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.359363 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tjx\" (UniqueName: \"kubernetes.io/projected/d755a8cc-ff77-4c6f-9fde-62a156178521-kube-api-access-q9tjx\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.361115 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d755a8cc-ff77-4c6f-9fde-62a156178521-config-volume\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.363243 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d755a8cc-ff77-4c6f-9fde-62a156178521-secret-volume\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.379110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tjx\" (UniqueName: \"kubernetes.io/projected/d755a8cc-ff77-4c6f-9fde-62a156178521-kube-api-access-q9tjx\") pod \"collect-profiles-29334015-xx95t\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.498967 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.801213 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-0e83-account-create-54jfg"] Oct 09 20:15:00 crc kubenswrapper[4907]: W1009 20:15:00.803885 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8acc1c_79a2_43fc_9493_5baff3406366.slice/crio-24bd203e63ad4fbd7be58a98003ea44420ac37519f12ce15cf919de9246ffd48 WatchSource:0}: Error finding container 24bd203e63ad4fbd7be58a98003ea44420ac37519f12ce15cf919de9246ffd48: Status 404 returned error can't find the container with id 24bd203e63ad4fbd7be58a98003ea44420ac37519f12ce15cf919de9246ffd48 Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.913077 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0e83-account-create-54jfg" event={"ID":"7f8acc1c-79a2-43fc-9493-5baff3406366","Type":"ContainerStarted","Data":"24bd203e63ad4fbd7be58a98003ea44420ac37519f12ce15cf919de9246ffd48"} Oct 09 20:15:00 crc kubenswrapper[4907]: W1009 20:15:00.965759 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd755a8cc_ff77_4c6f_9fde_62a156178521.slice/crio-6c261723dfca1d915e2c2dbb6ac54312203dec9680810cd78e08a31c0a670f47 WatchSource:0}: Error finding container 6c261723dfca1d915e2c2dbb6ac54312203dec9680810cd78e08a31c0a670f47: Status 404 returned error can't find the container with id 6c261723dfca1d915e2c2dbb6ac54312203dec9680810cd78e08a31c0a670f47 Oct 09 20:15:00 crc kubenswrapper[4907]: I1009 20:15:00.966568 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t"] Oct 09 20:15:01 crc kubenswrapper[4907]: I1009 20:15:01.927769 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="d755a8cc-ff77-4c6f-9fde-62a156178521" containerID="064e3b6fce12a68b3de923ea72dc02f7cb02734ce3f729ad1c48db82ccabe46b" exitCode=0 Oct 09 20:15:01 crc kubenswrapper[4907]: I1009 20:15:01.927869 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" event={"ID":"d755a8cc-ff77-4c6f-9fde-62a156178521","Type":"ContainerDied","Data":"064e3b6fce12a68b3de923ea72dc02f7cb02734ce3f729ad1c48db82ccabe46b"} Oct 09 20:15:01 crc kubenswrapper[4907]: I1009 20:15:01.928224 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" event={"ID":"d755a8cc-ff77-4c6f-9fde-62a156178521","Type":"ContainerStarted","Data":"6c261723dfca1d915e2c2dbb6ac54312203dec9680810cd78e08a31c0a670f47"} Oct 09 20:15:01 crc kubenswrapper[4907]: I1009 20:15:01.931100 4907 generic.go:334] "Generic (PLEG): container finished" podID="7f8acc1c-79a2-43fc-9493-5baff3406366" containerID="d17ddfcaeb8c40752de4c617beae455229e669ef819725d7ea26a4bd1eb235e3" exitCode=0 Oct 09 20:15:01 crc kubenswrapper[4907]: I1009 20:15:01.931148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0e83-account-create-54jfg" event={"ID":"7f8acc1c-79a2-43fc-9493-5baff3406366","Type":"ContainerDied","Data":"d17ddfcaeb8c40752de4c617beae455229e669ef819725d7ea26a4bd1eb235e3"} Oct 09 20:15:01 crc kubenswrapper[4907]: I1009 20:15:01.937403 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 09 20:15:01 crc kubenswrapper[4907]: I1009 20:15:01.963397 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 09 20:15:02 crc kubenswrapper[4907]: I1009 20:15:02.945673 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 
20:15:03.497319 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.508736 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0e83-account-create-54jfg" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.629510 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d755a8cc-ff77-4c6f-9fde-62a156178521-config-volume\") pod \"d755a8cc-ff77-4c6f-9fde-62a156178521\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.629919 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8t8n\" (UniqueName: \"kubernetes.io/projected/7f8acc1c-79a2-43fc-9493-5baff3406366-kube-api-access-l8t8n\") pod \"7f8acc1c-79a2-43fc-9493-5baff3406366\" (UID: \"7f8acc1c-79a2-43fc-9493-5baff3406366\") " Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.629977 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d755a8cc-ff77-4c6f-9fde-62a156178521-secret-volume\") pod \"d755a8cc-ff77-4c6f-9fde-62a156178521\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.629999 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9tjx\" (UniqueName: \"kubernetes.io/projected/d755a8cc-ff77-4c6f-9fde-62a156178521-kube-api-access-q9tjx\") pod \"d755a8cc-ff77-4c6f-9fde-62a156178521\" (UID: \"d755a8cc-ff77-4c6f-9fde-62a156178521\") " Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.630166 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d755a8cc-ff77-4c6f-9fde-62a156178521-config-volume" (OuterVolumeSpecName: "config-volume") pod "d755a8cc-ff77-4c6f-9fde-62a156178521" (UID: "d755a8cc-ff77-4c6f-9fde-62a156178521"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.630445 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d755a8cc-ff77-4c6f-9fde-62a156178521-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.636572 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d755a8cc-ff77-4c6f-9fde-62a156178521-kube-api-access-q9tjx" (OuterVolumeSpecName: "kube-api-access-q9tjx") pod "d755a8cc-ff77-4c6f-9fde-62a156178521" (UID: "d755a8cc-ff77-4c6f-9fde-62a156178521"). InnerVolumeSpecName "kube-api-access-q9tjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.644285 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d755a8cc-ff77-4c6f-9fde-62a156178521-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d755a8cc-ff77-4c6f-9fde-62a156178521" (UID: "d755a8cc-ff77-4c6f-9fde-62a156178521"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.644437 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8acc1c-79a2-43fc-9493-5baff3406366-kube-api-access-l8t8n" (OuterVolumeSpecName: "kube-api-access-l8t8n") pod "7f8acc1c-79a2-43fc-9493-5baff3406366" (UID: "7f8acc1c-79a2-43fc-9493-5baff3406366"). InnerVolumeSpecName "kube-api-access-l8t8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.731929 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8t8n\" (UniqueName: \"kubernetes.io/projected/7f8acc1c-79a2-43fc-9493-5baff3406366-kube-api-access-l8t8n\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.731966 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d755a8cc-ff77-4c6f-9fde-62a156178521-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.731979 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9tjx\" (UniqueName: \"kubernetes.io/projected/d755a8cc-ff77-4c6f-9fde-62a156178521-kube-api-access-q9tjx\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.954120 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0e83-account-create-54jfg" event={"ID":"7f8acc1c-79a2-43fc-9493-5baff3406366","Type":"ContainerDied","Data":"24bd203e63ad4fbd7be58a98003ea44420ac37519f12ce15cf919de9246ffd48"} Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.954155 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-0e83-account-create-54jfg" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.954171 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24bd203e63ad4fbd7be58a98003ea44420ac37519f12ce15cf919de9246ffd48" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.956164 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" event={"ID":"d755a8cc-ff77-4c6f-9fde-62a156178521","Type":"ContainerDied","Data":"6c261723dfca1d915e2c2dbb6ac54312203dec9680810cd78e08a31c0a670f47"} Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.956218 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c261723dfca1d915e2c2dbb6ac54312203dec9680810cd78e08a31c0a670f47" Oct 09 20:15:03 crc kubenswrapper[4907]: I1009 20:15:03.956180 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334015-xx95t" Oct 09 20:15:04 crc kubenswrapper[4907]: I1009 20:15:04.581456 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"] Oct 09 20:15:04 crc kubenswrapper[4907]: I1009 20:15:04.593282 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333970-4kdct"] Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.179213 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86eff4e6-938a-48fa-a116-c46597bc0868" path="/var/lib/kubelet/pods/86eff4e6-938a-48fa-a116-c46597bc0868/volumes" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.183432 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-kvg9t"] Oct 09 20:15:05 crc kubenswrapper[4907]: E1009 20:15:05.183880 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f8acc1c-79a2-43fc-9493-5baff3406366" containerName="mariadb-account-create" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.183897 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8acc1c-79a2-43fc-9493-5baff3406366" containerName="mariadb-account-create" Oct 09 20:15:05 crc kubenswrapper[4907]: E1009 20:15:05.183925 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d755a8cc-ff77-4c6f-9fde-62a156178521" containerName="collect-profiles" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.183932 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d755a8cc-ff77-4c6f-9fde-62a156178521" containerName="collect-profiles" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.184139 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d755a8cc-ff77-4c6f-9fde-62a156178521" containerName="collect-profiles" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.184165 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8acc1c-79a2-43fc-9493-5baff3406366" containerName="mariadb-account-create" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.188874 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.195271 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.195347 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-7ggvq" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.195272 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.195422 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.198104 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-kvg9t"] Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.261272 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktc6n\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-kube-api-access-ktc6n\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.261835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-config-data\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.261999 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-combined-ca-bundle\") 
pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.262386 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-scripts\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.263035 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-certs\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.374147 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-config-data\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.374357 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-combined-ca-bundle\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.374678 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-scripts\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " 
pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.374762 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-certs\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.374865 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktc6n\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-kube-api-access-ktc6n\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.392429 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-combined-ca-bundle\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.395185 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-scripts\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.399871 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-config-data\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.401235 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ktc6n\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-kube-api-access-ktc6n\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.414546 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-certs\") pod \"cloudkitty-db-sync-kvg9t\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:05 crc kubenswrapper[4907]: I1009 20:15:05.534736 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:06 crc kubenswrapper[4907]: I1009 20:15:06.088774 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-kvg9t"] Oct 09 20:15:06 crc kubenswrapper[4907]: I1009 20:15:06.999562 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-kvg9t" event={"ID":"3cb04018-212c-4798-85b3-85bd29b275d4","Type":"ContainerStarted","Data":"2c3b32794a536d54d19d417fcb27642a096b5dd4121862a5904e923af9a4a440"} Oct 09 20:15:07 crc kubenswrapper[4907]: I1009 20:15:07.449325 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-56cd74f89f-gl9k6" Oct 09 20:15:07 crc kubenswrapper[4907]: I1009 20:15:07.766615 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-68bbd7984c-gtcgb" Oct 09 20:15:07 crc kubenswrapper[4907]: I1009 20:15:07.832153 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-779849886d-s66fd" Oct 09 20:15:08 crc kubenswrapper[4907]: I1009 20:15:08.722401 4907 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 09 20:15:08 crc kubenswrapper[4907]: I1009 20:15:08.784771 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Oct 09 20:15:09 crc kubenswrapper[4907]: I1009 20:15:09.037274 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Oct 09 20:15:18 crc kubenswrapper[4907]: I1009 20:15:18.721664 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 09 20:15:18 crc kubenswrapper[4907]: I1009 20:15:18.968655 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 20:15:20 crc kubenswrapper[4907]: I1009 20:15:20.155893 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-kvg9t" event={"ID":"3cb04018-212c-4798-85b3-85bd29b275d4","Type":"ContainerStarted","Data":"b421fc5ac8bf6a8a02506638d2dafbb7cea9187d96752319e5ae444604058805"} Oct 09 20:15:20 crc kubenswrapper[4907]: I1009 20:15:20.175764 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-kvg9t" podStartSLOduration=2.196784074 podStartE2EDuration="15.175742951s" podCreationTimestamp="2025-10-09 20:15:05 +0000 UTC" firstStartedPulling="2025-10-09 20:15:06.091402234 +0000 UTC m=+2791.623369733" lastFinishedPulling="2025-10-09 20:15:19.070361121 +0000 UTC m=+2804.602328610" observedRunningTime="2025-10-09 20:15:20.169767832 +0000 UTC m=+2805.701735341" watchObservedRunningTime="2025-10-09 20:15:20.175742951 +0000 UTC 
m=+2805.707710440" Oct 09 20:15:23 crc kubenswrapper[4907]: I1009 20:15:23.190982 4907 generic.go:334] "Generic (PLEG): container finished" podID="3cb04018-212c-4798-85b3-85bd29b275d4" containerID="b421fc5ac8bf6a8a02506638d2dafbb7cea9187d96752319e5ae444604058805" exitCode=0 Oct 09 20:15:23 crc kubenswrapper[4907]: I1009 20:15:23.191038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-kvg9t" event={"ID":"3cb04018-212c-4798-85b3-85bd29b275d4","Type":"ContainerDied","Data":"b421fc5ac8bf6a8a02506638d2dafbb7cea9187d96752319e5ae444604058805"} Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.627583 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.748677 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-scripts\") pod \"3cb04018-212c-4798-85b3-85bd29b275d4\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.748724 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-combined-ca-bundle\") pod \"3cb04018-212c-4798-85b3-85bd29b275d4\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.748805 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-certs\") pod \"3cb04018-212c-4798-85b3-85bd29b275d4\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.748956 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-config-data\") pod \"3cb04018-212c-4798-85b3-85bd29b275d4\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.749005 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktc6n\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-kube-api-access-ktc6n\") pod \"3cb04018-212c-4798-85b3-85bd29b275d4\" (UID: \"3cb04018-212c-4798-85b3-85bd29b275d4\") " Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.756795 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-scripts" (OuterVolumeSpecName: "scripts") pod "3cb04018-212c-4798-85b3-85bd29b275d4" (UID: "3cb04018-212c-4798-85b3-85bd29b275d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.756957 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-kube-api-access-ktc6n" (OuterVolumeSpecName: "kube-api-access-ktc6n") pod "3cb04018-212c-4798-85b3-85bd29b275d4" (UID: "3cb04018-212c-4798-85b3-85bd29b275d4"). InnerVolumeSpecName "kube-api-access-ktc6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.757715 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-certs" (OuterVolumeSpecName: "certs") pod "3cb04018-212c-4798-85b3-85bd29b275d4" (UID: "3cb04018-212c-4798-85b3-85bd29b275d4"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.782748 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-config-data" (OuterVolumeSpecName: "config-data") pod "3cb04018-212c-4798-85b3-85bd29b275d4" (UID: "3cb04018-212c-4798-85b3-85bd29b275d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.785758 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cb04018-212c-4798-85b3-85bd29b275d4" (UID: "3cb04018-212c-4798-85b3-85bd29b275d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.851703 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.851769 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.851817 4907 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-certs\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:24 crc kubenswrapper[4907]: I1009 20:15:24.852038 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cb04018-212c-4798-85b3-85bd29b275d4-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:24 crc 
kubenswrapper[4907]: I1009 20:15:24.852105 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktc6n\" (UniqueName: \"kubernetes.io/projected/3cb04018-212c-4798-85b3-85bd29b275d4-kube-api-access-ktc6n\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.226566 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-kvg9t" event={"ID":"3cb04018-212c-4798-85b3-85bd29b275d4","Type":"ContainerDied","Data":"2c3b32794a536d54d19d417fcb27642a096b5dd4121862a5904e923af9a4a440"} Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.226620 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3b32794a536d54d19d417fcb27642a096b5dd4121862a5904e923af9a4a440" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.227042 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-kvg9t" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.325554 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-7c8tt"] Oct 09 20:15:25 crc kubenswrapper[4907]: E1009 20:15:25.326111 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb04018-212c-4798-85b3-85bd29b275d4" containerName="cloudkitty-db-sync" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.326135 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb04018-212c-4798-85b3-85bd29b275d4" containerName="cloudkitty-db-sync" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.326340 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb04018-212c-4798-85b3-85bd29b275d4" containerName="cloudkitty-db-sync" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.327330 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.329374 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.329739 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-7ggvq" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.329880 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.331652 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.339023 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-7c8tt"] Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.464539 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-scripts\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.464646 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-certs\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.464855 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wg8f\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-kube-api-access-5wg8f\") pod 
\"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.465209 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-combined-ca-bundle\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.465316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-config-data\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.567112 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-certs\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.567701 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wg8f\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-kube-api-access-5wg8f\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.567906 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-combined-ca-bundle\") pod 
\"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.568057 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-config-data\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.568784 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-scripts\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.572983 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-combined-ca-bundle\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.573437 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-config-data\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.573535 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-certs\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " 
pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.578123 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-scripts\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.588288 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wg8f\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-kube-api-access-5wg8f\") pod \"cloudkitty-storageinit-7c8tt\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:25 crc kubenswrapper[4907]: I1009 20:15:25.643171 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:26 crc kubenswrapper[4907]: I1009 20:15:26.129448 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-7c8tt"] Oct 09 20:15:26 crc kubenswrapper[4907]: I1009 20:15:26.237939 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-7c8tt" event={"ID":"69f91323-f2d1-4c69-952a-453677ac2d28","Type":"ContainerStarted","Data":"0dfcfb0c1f11459435ef7118f5966b6f456467be0717dc03b2ad1fca203a48cb"} Oct 09 20:15:27 crc kubenswrapper[4907]: I1009 20:15:27.255726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-7c8tt" event={"ID":"69f91323-f2d1-4c69-952a-453677ac2d28","Type":"ContainerStarted","Data":"d043bc39c30b5457c23cb75307fe268ac009c04692962fe7a73935859b4fc925"} Oct 09 20:15:27 crc kubenswrapper[4907]: I1009 20:15:27.276879 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-7c8tt" podStartSLOduration=2.276862373 
podStartE2EDuration="2.276862373s" podCreationTimestamp="2025-10-09 20:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 20:15:27.270990887 +0000 UTC m=+2812.802958406" watchObservedRunningTime="2025-10-09 20:15:27.276862373 +0000 UTC m=+2812.808829862" Oct 09 20:15:28 crc kubenswrapper[4907]: I1009 20:15:28.267993 4907 generic.go:334] "Generic (PLEG): container finished" podID="69f91323-f2d1-4c69-952a-453677ac2d28" containerID="d043bc39c30b5457c23cb75307fe268ac009c04692962fe7a73935859b4fc925" exitCode=0 Oct 09 20:15:28 crc kubenswrapper[4907]: I1009 20:15:28.268116 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-7c8tt" event={"ID":"69f91323-f2d1-4c69-952a-453677ac2d28","Type":"ContainerDied","Data":"d043bc39c30b5457c23cb75307fe268ac009c04692962fe7a73935859b4fc925"} Oct 09 20:15:28 crc kubenswrapper[4907]: E1009 20:15:28.332924 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f91323_f2d1_4c69_952a_453677ac2d28.slice/crio-conmon-d043bc39c30b5457c23cb75307fe268ac009c04692962fe7a73935859b4fc925.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f91323_f2d1_4c69_952a_453677ac2d28.slice/crio-d043bc39c30b5457c23cb75307fe268ac009c04692962fe7a73935859b4fc925.scope\": RecentStats: unable to find data in memory cache]" Oct 09 20:15:28 crc kubenswrapper[4907]: I1009 20:15:28.723074 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.679699 4907 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.764073 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-combined-ca-bundle\") pod \"69f91323-f2d1-4c69-952a-453677ac2d28\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.764185 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-certs\") pod \"69f91323-f2d1-4c69-952a-453677ac2d28\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.764245 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-config-data\") pod \"69f91323-f2d1-4c69-952a-453677ac2d28\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.764316 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wg8f\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-kube-api-access-5wg8f\") pod \"69f91323-f2d1-4c69-952a-453677ac2d28\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.764451 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-scripts\") pod \"69f91323-f2d1-4c69-952a-453677ac2d28\" (UID: \"69f91323-f2d1-4c69-952a-453677ac2d28\") " Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.770338 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-scripts" (OuterVolumeSpecName: "scripts") pod "69f91323-f2d1-4c69-952a-453677ac2d28" (UID: "69f91323-f2d1-4c69-952a-453677ac2d28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.770511 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-certs" (OuterVolumeSpecName: "certs") pod "69f91323-f2d1-4c69-952a-453677ac2d28" (UID: "69f91323-f2d1-4c69-952a-453677ac2d28"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.770649 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-kube-api-access-5wg8f" (OuterVolumeSpecName: "kube-api-access-5wg8f") pod "69f91323-f2d1-4c69-952a-453677ac2d28" (UID: "69f91323-f2d1-4c69-952a-453677ac2d28"). InnerVolumeSpecName "kube-api-access-5wg8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.794758 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69f91323-f2d1-4c69-952a-453677ac2d28" (UID: "69f91323-f2d1-4c69-952a-453677ac2d28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.805161 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-config-data" (OuterVolumeSpecName: "config-data") pod "69f91323-f2d1-4c69-952a-453677ac2d28" (UID: "69f91323-f2d1-4c69-952a-453677ac2d28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.868498 4907 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-certs\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.869318 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.869402 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wg8f\" (UniqueName: \"kubernetes.io/projected/69f91323-f2d1-4c69-952a-453677ac2d28-kube-api-access-5wg8f\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.869516 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:29 crc kubenswrapper[4907]: I1009 20:15:29.869606 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f91323-f2d1-4c69-952a-453677ac2d28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.291093 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-7c8tt" event={"ID":"69f91323-f2d1-4c69-952a-453677ac2d28","Type":"ContainerDied","Data":"0dfcfb0c1f11459435ef7118f5966b6f456467be0717dc03b2ad1fca203a48cb"} Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.291449 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dfcfb0c1f11459435ef7118f5966b6f456467be0717dc03b2ad1fca203a48cb" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.291182 4907 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-7c8tt" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.465926 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Oct 09 20:15:30 crc kubenswrapper[4907]: E1009 20:15:30.466580 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f91323-f2d1-4c69-952a-453677ac2d28" containerName="cloudkitty-storageinit" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.466608 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f91323-f2d1-4c69-952a-453677ac2d28" containerName="cloudkitty-storageinit" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.466893 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f91323-f2d1-4c69-952a-453677ac2d28" containerName="cloudkitty-storageinit" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.467860 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.472058 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.472292 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.472541 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-7ggvq" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.472593 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.472590 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.493637 4907 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cloudkitty-proc-0"] Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.582762 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.582915 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-config-data\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.583004 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.583212 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-scripts\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.583292 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c613c00-9feb-4432-9d03-b980178cbe26-certs\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 
20:15:30.583361 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph4lh\" (UniqueName: \"kubernetes.io/projected/4c613c00-9feb-4432-9d03-b980178cbe26-kube-api-access-ph4lh\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.632233 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.634438 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.640028 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.648433 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685357 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-certs\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-scripts\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685460 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-combined-ca-bundle\") pod 
\"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685498 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-config-data\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685561 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685590 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-scripts\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685610 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c613c00-9feb-4432-9d03-b980178cbe26-certs\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685629 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph4lh\" (UniqueName: \"kubernetes.io/projected/4c613c00-9feb-4432-9d03-b980178cbe26-kube-api-access-ph4lh\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 
20:15:30.685647 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj9c8\" (UniqueName: \"kubernetes.io/projected/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-kube-api-access-sj9c8\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685704 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-logs\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685725 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685772 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-config-data\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.685788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.692110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-scripts\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.693734 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-config-data\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.694212 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4c613c00-9feb-4432-9d03-b980178cbe26-certs\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.697203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.702997 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c613c00-9feb-4432-9d03-b980178cbe26-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.703670 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph4lh\" (UniqueName: \"kubernetes.io/projected/4c613c00-9feb-4432-9d03-b980178cbe26-kube-api-access-ph4lh\") pod \"cloudkitty-proc-0\" (UID: \"4c613c00-9feb-4432-9d03-b980178cbe26\") " pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc 
kubenswrapper[4907]: I1009 20:15:30.787741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-logs\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.787835 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.787877 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-certs\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.787904 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-scripts\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.787936 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-config-data\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.787983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.788019 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj9c8\" (UniqueName: \"kubernetes.io/projected/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-kube-api-access-sj9c8\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.788127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-logs\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.793258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-config-data\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.793379 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.793648 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc 
kubenswrapper[4907]: I1009 20:15:30.793783 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.797424 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-scripts\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.797967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-certs\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.821017 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj9c8\" (UniqueName: \"kubernetes.io/projected/9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9-kube-api-access-sj9c8\") pod \"cloudkitty-api-0\" (UID: \"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9\") " pod="openstack/cloudkitty-api-0" Oct 09 20:15:30 crc kubenswrapper[4907]: I1009 20:15:30.962823 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Oct 09 20:15:31 crc kubenswrapper[4907]: I1009 20:15:31.380545 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Oct 09 20:15:31 crc kubenswrapper[4907]: I1009 20:15:31.641178 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Oct 09 20:15:32 crc kubenswrapper[4907]: I1009 20:15:32.313726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9","Type":"ContainerStarted","Data":"a05caa3e441071d2523dc47b37b92538fe7847ee9238d55f3041418f1a5aa994"} Oct 09 20:15:32 crc kubenswrapper[4907]: I1009 20:15:32.314200 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9","Type":"ContainerStarted","Data":"3c28495b43efbe7b86b289bfb5acdd72c86d4c1567b3bd94722f32270f594ae2"} Oct 09 20:15:32 crc kubenswrapper[4907]: I1009 20:15:32.314219 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Oct 09 20:15:32 crc kubenswrapper[4907]: I1009 20:15:32.314233 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9","Type":"ContainerStarted","Data":"3a2aa99145b65a9e0feb4d2994d38d465eab8f6789b0888f5547b4089bc9587b"} Oct 09 20:15:32 crc kubenswrapper[4907]: I1009 20:15:32.314931 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"4c613c00-9feb-4432-9d03-b980178cbe26","Type":"ContainerStarted","Data":"e21d2e4af010e742c99dc10fda916a99d71be7b360d3613aee3fe7a0c91920df"} Oct 09 20:15:32 crc kubenswrapper[4907]: I1009 20:15:32.338519 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.338500291 podStartE2EDuration="2.338500291s" 
podCreationTimestamp="2025-10-09 20:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 20:15:32.329037954 +0000 UTC m=+2817.861005453" watchObservedRunningTime="2025-10-09 20:15:32.338500291 +0000 UTC m=+2817.870467780" Oct 09 20:15:34 crc kubenswrapper[4907]: I1009 20:15:34.336611 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"4c613c00-9feb-4432-9d03-b980178cbe26","Type":"ContainerStarted","Data":"bf1442663c866986bb6c86fd485552ee41c49568fd88d792d5b8f2c637cf1c2e"} Oct 09 20:15:34 crc kubenswrapper[4907]: I1009 20:15:34.357015 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.54723936 podStartE2EDuration="4.356995093s" podCreationTimestamp="2025-10-09 20:15:30 +0000 UTC" firstStartedPulling="2025-10-09 20:15:31.384059668 +0000 UTC m=+2816.916027157" lastFinishedPulling="2025-10-09 20:15:33.193815391 +0000 UTC m=+2818.725782890" observedRunningTime="2025-10-09 20:15:34.351573217 +0000 UTC m=+2819.883540716" watchObservedRunningTime="2025-10-09 20:15:34.356995093 +0000 UTC m=+2819.888962582" Oct 09 20:15:38 crc kubenswrapper[4907]: I1009 20:15:38.721800 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 09 20:15:41 crc kubenswrapper[4907]: I1009 20:15:41.582518 4907 scope.go:117] "RemoveContainer" containerID="ecc416dcb1792cb9d8173232b33a63f5671bfd94f2df0f527d6ddd225f51c7f6" Oct 09 20:15:43 crc kubenswrapper[4907]: I1009 20:15:43.631257 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:15:43 crc kubenswrapper[4907]: I1009 20:15:43.632110 4907 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-central-agent" containerID="cri-o://f48d9e50a44d366055ca54facb6090a89759038d0919ff38c294907c7a9a275c" gracePeriod=30 Oct 09 20:15:43 crc kubenswrapper[4907]: I1009 20:15:43.632174 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="sg-core" containerID="cri-o://1725b96181c6e970bfd9df43acd134327389c5592a6be148c16b2d384f29700f" gracePeriod=30 Oct 09 20:15:43 crc kubenswrapper[4907]: I1009 20:15:43.632182 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="proxy-httpd" containerID="cri-o://d33783f2bbe5ebf346dd7b6037dd0bf920df86595c40c175ce1c769027602c9b" gracePeriod=30 Oct 09 20:15:43 crc kubenswrapper[4907]: I1009 20:15:43.632265 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-notification-agent" containerID="cri-o://2211b364fea9c948032728c78597a6a3f9a058fe6967d914af4c36ac26ba0e56" gracePeriod=30 Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465331 4907 generic.go:334] "Generic (PLEG): container finished" podID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerID="d33783f2bbe5ebf346dd7b6037dd0bf920df86595c40c175ce1c769027602c9b" exitCode=0 Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465634 4907 generic.go:334] "Generic (PLEG): container finished" podID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerID="1725b96181c6e970bfd9df43acd134327389c5592a6be148c16b2d384f29700f" exitCode=2 Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerDied","Data":"d33783f2bbe5ebf346dd7b6037dd0bf920df86595c40c175ce1c769027602c9b"} Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465680 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerDied","Data":"1725b96181c6e970bfd9df43acd134327389c5592a6be148c16b2d384f29700f"} Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465693 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerDied","Data":"2211b364fea9c948032728c78597a6a3f9a058fe6967d914af4c36ac26ba0e56"} Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465645 4907 generic.go:334] "Generic (PLEG): container finished" podID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerID="2211b364fea9c948032728c78597a6a3f9a058fe6967d914af4c36ac26ba0e56" exitCode=0 Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465716 4907 generic.go:334] "Generic (PLEG): container finished" podID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerID="f48d9e50a44d366055ca54facb6090a89759038d0919ff38c294907c7a9a275c" exitCode=0 Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465733 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerDied","Data":"f48d9e50a44d366055ca54facb6090a89759038d0919ff38c294907c7a9a275c"} Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465742 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f94f166b-aee9-436b-9ced-297ca8cdc96a","Type":"ContainerDied","Data":"cf9106594443b9b6d5770677cc2d5ff9eb010373420d17c1f66bf52228a42e07"} Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.465751 4907 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="cf9106594443b9b6d5770677cc2d5ff9eb010373420d17c1f66bf52228a42e07" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.504681 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.588261 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-run-httpd\") pod \"f94f166b-aee9-436b-9ced-297ca8cdc96a\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.588388 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-log-httpd\") pod \"f94f166b-aee9-436b-9ced-297ca8cdc96a\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.588418 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-sg-core-conf-yaml\") pod \"f94f166b-aee9-436b-9ced-297ca8cdc96a\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.588520 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjlvm\" (UniqueName: \"kubernetes.io/projected/f94f166b-aee9-436b-9ced-297ca8cdc96a-kube-api-access-vjlvm\") pod \"f94f166b-aee9-436b-9ced-297ca8cdc96a\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.588563 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-config-data\") pod \"f94f166b-aee9-436b-9ced-297ca8cdc96a\" (UID: 
\"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.588699 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-scripts\") pod \"f94f166b-aee9-436b-9ced-297ca8cdc96a\" (UID: \"f94f166b-aee9-436b-9ced-297ca8cdc96a\") " Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.590558 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f94f166b-aee9-436b-9ced-297ca8cdc96a" (UID: "f94f166b-aee9-436b-9ced-297ca8cdc96a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.593646 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f94f166b-aee9-436b-9ced-297ca8cdc96a" (UID: "f94f166b-aee9-436b-9ced-297ca8cdc96a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.604195 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94f166b-aee9-436b-9ced-297ca8cdc96a-kube-api-access-vjlvm" (OuterVolumeSpecName: "kube-api-access-vjlvm") pod "f94f166b-aee9-436b-9ced-297ca8cdc96a" (UID: "f94f166b-aee9-436b-9ced-297ca8cdc96a"). InnerVolumeSpecName "kube-api-access-vjlvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.609618 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-scripts" (OuterVolumeSpecName: "scripts") pod "f94f166b-aee9-436b-9ced-297ca8cdc96a" (UID: "f94f166b-aee9-436b-9ced-297ca8cdc96a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.659630 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f94f166b-aee9-436b-9ced-297ca8cdc96a" (UID: "f94f166b-aee9-436b-9ced-297ca8cdc96a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.700798 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.700835 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.700843 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f94f166b-aee9-436b-9ced-297ca8cdc96a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.700851 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:44 crc kubenswrapper[4907]: 
I1009 20:15:44.700861 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjlvm\" (UniqueName: \"kubernetes.io/projected/f94f166b-aee9-436b-9ced-297ca8cdc96a-kube-api-access-vjlvm\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.781614 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-config-data" (OuterVolumeSpecName: "config-data") pod "f94f166b-aee9-436b-9ced-297ca8cdc96a" (UID: "f94f166b-aee9-436b-9ced-297ca8cdc96a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:15:44 crc kubenswrapper[4907]: I1009 20:15:44.802287 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94f166b-aee9-436b-9ced-297ca8cdc96a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.478252 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.517318 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.538791 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.550122 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:15:45 crc kubenswrapper[4907]: E1009 20:15:45.550924 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-central-agent" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.550949 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-central-agent" Oct 09 20:15:45 crc kubenswrapper[4907]: E1009 20:15:45.550968 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="proxy-httpd" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.550975 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="proxy-httpd" Oct 09 20:15:45 crc kubenswrapper[4907]: E1009 20:15:45.551000 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="sg-core" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.551008 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="sg-core" Oct 09 20:15:45 crc kubenswrapper[4907]: E1009 20:15:45.551033 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-notification-agent" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.551042 4907 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-notification-agent" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.551286 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-notification-agent" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.551308 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="sg-core" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.551333 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="ceilometer-central-agent" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.551353 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" containerName="proxy-httpd" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.553734 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.558429 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-sv5rk" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.561842 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.562016 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.563221 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.618266 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-log-httpd\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.618528 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-run-httpd\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.618683 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdm5s\" (UniqueName: \"kubernetes.io/projected/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-kube-api-access-rdm5s\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.619012 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-scripts\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.619174 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-config-data\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.619283 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.721536 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-log-httpd\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.721628 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-run-httpd\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.721755 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdm5s\" (UniqueName: \"kubernetes.io/projected/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-kube-api-access-rdm5s\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " 
pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.721852 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-scripts\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.721931 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-config-data\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.721970 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.722190 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-log-httpd\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.722211 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-run-httpd\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.726381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.726675 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-scripts\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.736053 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-config-data\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.742172 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdm5s\" (UniqueName: \"kubernetes.io/projected/38c4b2fd-cdaf-4217-a934-3210e6eb4f80-kube-api-access-rdm5s\") pod \"ceilometer-0\" (UID: \"38c4b2fd-cdaf-4217-a934-3210e6eb4f80\") " pod="openstack/ceilometer-0" Oct 09 20:15:45 crc kubenswrapper[4907]: I1009 20:15:45.876086 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 20:15:46 crc kubenswrapper[4907]: I1009 20:15:46.356768 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 20:15:46 crc kubenswrapper[4907]: I1009 20:15:46.500008 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38c4b2fd-cdaf-4217-a934-3210e6eb4f80","Type":"ContainerStarted","Data":"8af9c6a342ec309d9efe8dc838f6a6fafb31a4af3fd1859b114fef335e290078"} Oct 09 20:15:47 crc kubenswrapper[4907]: I1009 20:15:47.163165 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94f166b-aee9-436b-9ced-297ca8cdc96a" path="/var/lib/kubelet/pods/f94f166b-aee9-436b-9ced-297ca8cdc96a/volumes" Oct 09 20:15:47 crc kubenswrapper[4907]: I1009 20:15:47.514850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38c4b2fd-cdaf-4217-a934-3210e6eb4f80","Type":"ContainerStarted","Data":"eeb421fad6cfccaebf04d54180746f90cb650ab212df68a03521b29448c85755"} Oct 09 20:15:48 crc kubenswrapper[4907]: I1009 20:15:48.527148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38c4b2fd-cdaf-4217-a934-3210e6eb4f80","Type":"ContainerStarted","Data":"bf2c682af4556ccef7c1d542a8cb513b258b0b02503901df81eb6e6c6e1a62f8"} Oct 09 20:15:48 crc kubenswrapper[4907]: I1009 20:15:48.909220 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Oct 09 20:15:49 crc kubenswrapper[4907]: I1009 20:15:49.550051 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38c4b2fd-cdaf-4217-a934-3210e6eb4f80","Type":"ContainerStarted","Data":"c878649b69e6727367d83d971624ae5ec8f23eb124ef037ba7a62a6be8073c5f"} Oct 09 20:15:50 crc kubenswrapper[4907]: I1009 20:15:50.562398 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"38c4b2fd-cdaf-4217-a934-3210e6eb4f80","Type":"ContainerStarted","Data":"c55c5e874852f796f473dfbc2bb9aaec62f73bd3668f71ae011aaf75ed4b3ada"} Oct 09 20:15:50 crc kubenswrapper[4907]: I1009 20:15:50.563887 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 20:15:50 crc kubenswrapper[4907]: I1009 20:15:50.602733 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.167275329 podStartE2EDuration="5.602707283s" podCreationTimestamp="2025-10-09 20:15:45 +0000 UTC" firstStartedPulling="2025-10-09 20:15:46.354705535 +0000 UTC m=+2831.886673024" lastFinishedPulling="2025-10-09 20:15:49.790137479 +0000 UTC m=+2835.322104978" observedRunningTime="2025-10-09 20:15:50.586865518 +0000 UTC m=+2836.118833097" watchObservedRunningTime="2025-10-09 20:15:50.602707283 +0000 UTC m=+2836.134674802" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.092895 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p6p94"] Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.096125 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.108863 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6p94"] Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.206823 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-utilities\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.206975 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvcm6\" (UniqueName: \"kubernetes.io/projected/38fdb242-344f-42a1-978f-59315af85ce1-kube-api-access-hvcm6\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.207079 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-catalog-content\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.309250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-catalog-content\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.309338 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-utilities\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.309418 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvcm6\" (UniqueName: \"kubernetes.io/projected/38fdb242-344f-42a1-978f-59315af85ce1-kube-api-access-hvcm6\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.309802 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-catalog-content\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.309933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-utilities\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.330269 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvcm6\" (UniqueName: \"kubernetes.io/projected/38fdb242-344f-42a1-978f-59315af85ce1-kube-api-access-hvcm6\") pod \"certified-operators-p6p94\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.442627 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:15:55 crc kubenswrapper[4907]: I1009 20:15:55.976007 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6p94"] Oct 09 20:15:56 crc kubenswrapper[4907]: I1009 20:15:56.627162 4907 generic.go:334] "Generic (PLEG): container finished" podID="38fdb242-344f-42a1-978f-59315af85ce1" containerID="381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376" exitCode=0 Oct 09 20:15:56 crc kubenswrapper[4907]: I1009 20:15:56.627222 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6p94" event={"ID":"38fdb242-344f-42a1-978f-59315af85ce1","Type":"ContainerDied","Data":"381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376"} Oct 09 20:15:56 crc kubenswrapper[4907]: I1009 20:15:56.627460 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6p94" event={"ID":"38fdb242-344f-42a1-978f-59315af85ce1","Type":"ContainerStarted","Data":"f7cf3977f74074a28ff85ad9a6f2a038f9109b2053f6738d7371adba40a4c910"} Oct 09 20:15:57 crc kubenswrapper[4907]: I1009 20:15:57.640940 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6p94" event={"ID":"38fdb242-344f-42a1-978f-59315af85ce1","Type":"ContainerStarted","Data":"484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1"} Oct 09 20:15:59 crc kubenswrapper[4907]: I1009 20:15:59.667065 4907 generic.go:334] "Generic (PLEG): container finished" podID="38fdb242-344f-42a1-978f-59315af85ce1" containerID="484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1" exitCode=0 Oct 09 20:15:59 crc kubenswrapper[4907]: I1009 20:15:59.667142 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6p94" 
event={"ID":"38fdb242-344f-42a1-978f-59315af85ce1","Type":"ContainerDied","Data":"484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1"} Oct 09 20:16:00 crc kubenswrapper[4907]: I1009 20:16:00.684342 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6p94" event={"ID":"38fdb242-344f-42a1-978f-59315af85ce1","Type":"ContainerStarted","Data":"b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4"} Oct 09 20:16:00 crc kubenswrapper[4907]: I1009 20:16:00.726080 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p6p94" podStartSLOduration=2.098227257 podStartE2EDuration="5.726056681s" podCreationTimestamp="2025-10-09 20:15:55 +0000 UTC" firstStartedPulling="2025-10-09 20:15:56.629153663 +0000 UTC m=+2842.161121152" lastFinishedPulling="2025-10-09 20:16:00.256983097 +0000 UTC m=+2845.788950576" observedRunningTime="2025-10-09 20:16:00.719641931 +0000 UTC m=+2846.251609420" watchObservedRunningTime="2025-10-09 20:16:00.726056681 +0000 UTC m=+2846.258024170" Oct 09 20:16:05 crc kubenswrapper[4907]: I1009 20:16:05.443696 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:16:05 crc kubenswrapper[4907]: I1009 20:16:05.444344 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:16:05 crc kubenswrapper[4907]: I1009 20:16:05.508749 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:16:05 crc kubenswrapper[4907]: I1009 20:16:05.820161 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:16:05 crc kubenswrapper[4907]: I1009 20:16:05.879075 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-p6p94"] Oct 09 20:16:07 crc kubenswrapper[4907]: I1009 20:16:07.777810 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p6p94" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="registry-server" containerID="cri-o://b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4" gracePeriod=2 Oct 09 20:16:07 crc kubenswrapper[4907]: I1009 20:16:07.791015 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.374975 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.530843 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvcm6\" (UniqueName: \"kubernetes.io/projected/38fdb242-344f-42a1-978f-59315af85ce1-kube-api-access-hvcm6\") pod \"38fdb242-344f-42a1-978f-59315af85ce1\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.530946 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-catalog-content\") pod \"38fdb242-344f-42a1-978f-59315af85ce1\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.531179 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-utilities\") pod \"38fdb242-344f-42a1-978f-59315af85ce1\" (UID: \"38fdb242-344f-42a1-978f-59315af85ce1\") " Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.531815 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-utilities" (OuterVolumeSpecName: "utilities") pod "38fdb242-344f-42a1-978f-59315af85ce1" (UID: "38fdb242-344f-42a1-978f-59315af85ce1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.538046 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fdb242-344f-42a1-978f-59315af85ce1-kube-api-access-hvcm6" (OuterVolumeSpecName: "kube-api-access-hvcm6") pod "38fdb242-344f-42a1-978f-59315af85ce1" (UID: "38fdb242-344f-42a1-978f-59315af85ce1"). InnerVolumeSpecName "kube-api-access-hvcm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.572927 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38fdb242-344f-42a1-978f-59315af85ce1" (UID: "38fdb242-344f-42a1-978f-59315af85ce1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.633479 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.633512 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvcm6\" (UniqueName: \"kubernetes.io/projected/38fdb242-344f-42a1-978f-59315af85ce1-kube-api-access-hvcm6\") on node \"crc\" DevicePath \"\"" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.633523 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38fdb242-344f-42a1-978f-59315af85ce1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.790433 4907 generic.go:334] "Generic (PLEG): container finished" podID="38fdb242-344f-42a1-978f-59315af85ce1" containerID="b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4" exitCode=0 Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.790501 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6p94" event={"ID":"38fdb242-344f-42a1-978f-59315af85ce1","Type":"ContainerDied","Data":"b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4"} Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.790585 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6p94" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.790600 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6p94" event={"ID":"38fdb242-344f-42a1-978f-59315af85ce1","Type":"ContainerDied","Data":"f7cf3977f74074a28ff85ad9a6f2a038f9109b2053f6738d7371adba40a4c910"} Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.790630 4907 scope.go:117] "RemoveContainer" containerID="b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.824645 4907 scope.go:117] "RemoveContainer" containerID="484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.843108 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6p94"] Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.854882 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p6p94"] Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.865541 4907 scope.go:117] "RemoveContainer" containerID="381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.928107 4907 scope.go:117] "RemoveContainer" containerID="b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4" Oct 09 20:16:08 crc kubenswrapper[4907]: E1009 20:16:08.928899 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4\": container with ID starting with b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4 not found: ID does not exist" containerID="b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.928949 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4"} err="failed to get container status \"b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4\": rpc error: code = NotFound desc = could not find container \"b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4\": container with ID starting with b36a34d887918467711d9dbcfda037905abdf45f586fe19b53dbda771ce550a4 not found: ID does not exist" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.928983 4907 scope.go:117] "RemoveContainer" containerID="484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1" Oct 09 20:16:08 crc kubenswrapper[4907]: E1009 20:16:08.929845 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1\": container with ID starting with 484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1 not found: ID does not exist" containerID="484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.929907 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1"} err="failed to get container status \"484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1\": rpc error: code = NotFound desc = could not find container \"484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1\": container with ID starting with 484a23ed8f947342000bebd04765d76e24c2d32e547ece78677307c4ee4e93f1 not found: ID does not exist" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.929940 4907 scope.go:117] "RemoveContainer" containerID="381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376" Oct 09 20:16:08 crc kubenswrapper[4907]: E1009 
20:16:08.930451 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376\": container with ID starting with 381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376 not found: ID does not exist" containerID="381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376" Oct 09 20:16:08 crc kubenswrapper[4907]: I1009 20:16:08.930700 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376"} err="failed to get container status \"381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376\": rpc error: code = NotFound desc = could not find container \"381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376\": container with ID starting with 381f4d70ef0a76d0fce3e603ef8b80b2e7d4e57381facb3e351b0fc79ed3f376 not found: ID does not exist" Oct 09 20:16:09 crc kubenswrapper[4907]: I1009 20:16:09.176935 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38fdb242-344f-42a1-978f-59315af85ce1" path="/var/lib/kubelet/pods/38fdb242-344f-42a1-978f-59315af85ce1/volumes" Oct 09 20:16:15 crc kubenswrapper[4907]: I1009 20:16:15.883130 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 20:16:36 crc kubenswrapper[4907]: I1009 20:16:36.299130 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:16:36 crc kubenswrapper[4907]: I1009 20:16:36.299842 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.526373 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ansible-tests-cloudkitty-s00-cloudkitty"] Oct 09 20:16:58 crc kubenswrapper[4907]: E1009 20:16:58.527250 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="registry-server" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.527262 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="registry-server" Oct 09 20:16:58 crc kubenswrapper[4907]: E1009 20:16:58.527276 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="extract-utilities" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.527283 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="extract-utilities" Oct 09 20:16:58 crc kubenswrapper[4907]: E1009 20:16:58.527295 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="extract-content" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.527301 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="extract-content" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.527538 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="38fdb242-344f-42a1-978f-59315af85ce1" containerName="registry-server" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.528496 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.532849 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-f86fh" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.541880 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.555659 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansible-tests-cloudkitty-s00-cloudkitty"] Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.646400 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-workload-ssh-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.646497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.646552 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-ca-certs\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.646634 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.646675 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-workdir\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.646753 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.646987 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-compute-ssh-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.647104 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-temporary\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.748929 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-temporary\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.749043 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-workload-ssh-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.749072 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.749526 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-temporary\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 
20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.749820 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-ca-certs\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.749880 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.749971 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-workdir\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.750422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.750666 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-workdir\") pod 
\"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.751529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.751623 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-compute-ssh-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.754125 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.754168 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/82e01c0082acc3830f25a8bdd7dc43b96d87a99163a1af6217cbc20f7f2abe43/globalmount\"" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.757541 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-compute-ssh-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.758233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-workload-ssh-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.773299 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config-secret\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.778941 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-ca-certs\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.817414 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\") pod \"ansible-tests-cloudkitty-s00-cloudkitty\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:58 crc kubenswrapper[4907]: I1009 20:16:58.860418 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:16:59 crc kubenswrapper[4907]: I1009 20:16:59.311069 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 20:16:59 crc kubenswrapper[4907]: I1009 20:16:59.319206 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansible-tests-cloudkitty-s00-cloudkitty"] Oct 09 20:16:59 crc kubenswrapper[4907]: I1009 20:16:59.435338 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" event={"ID":"92e389b9-a749-4f7e-9c0a-3c901329ff51","Type":"ContainerStarted","Data":"d7999b012b0601e398c87bd99c22c088018a202b2a292279688ce95958b59a6c"} Oct 09 20:17:06 crc kubenswrapper[4907]: I1009 20:17:06.299706 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:17:06 crc kubenswrapper[4907]: I1009 20:17:06.300631 4907 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:17:14 crc kubenswrapper[4907]: E1009 20:17:14.070874 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified" Oct 09 20:17:14 crc kubenswrapper[4907]: E1009 20:17:14.071577 4907 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 09 20:17:14 crc kubenswrapper[4907]: container &Container{Name:ansible-tests-cloudkitty,Image:quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_ANSIBLE_EXTRA_VARS,Value:,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_FILE_EXTRA_VARS,Value:cifmw_openshift_kubeconfig: ~/.kube.config Oct 09 20:17:14 crc kubenswrapper[4907]: cifmw_path: "{{ ansible_env.PATH }}:/var/lib/ansible/bin" Oct 09 20:17:14 crc kubenswrapper[4907]: cifmw_openshift_api: api.crc.testing:6443 Oct 09 20:17:14 crc kubenswrapper[4907]: cifmw_openshift_password: '12345678' Oct 09 20:17:14 crc kubenswrapper[4907]: cifmw_openshift_user: kubeadmin Oct 09 20:17:14 crc kubenswrapper[4907]: openstack_cmd: "oc -n openstack rsh openstackclient openstack" Oct 09 20:17:14 crc kubenswrapper[4907]: patch_observabilityclient: true Oct 09 20:17:14 crc kubenswrapper[4907]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_BRANCH,Value:,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_REPO,Value:http://github.com/elfiesmelfie/feature-verification-tests,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_INVENTORY,Value:localhost ansible_connection=local ansible_python_interpreter=python3 Oct 09 20:17:14 crc kubenswrapper[4907]: compute-0 
ansible_host=192.168.122.100 ansible_user=root ansible_ssh_private_key_file=/var/lib/ansible/.ssh/compute_id Oct 09 20:17:14 crc kubenswrapper[4907]: compute-1 ansible_host=192.168.122.101 ansible_user=root ansible_ssh_private_key_file=/var/lib/ansible/.ssh/compute_id Oct 09 20:17:14 crc kubenswrapper[4907]: [compute] Oct 09 20:17:14 crc kubenswrapper[4907]: compute-0 Oct 09 20:17:14 crc kubenswrapper[4907]: compute-1 Oct 09 20:17:14 crc kubenswrapper[4907]: [controller] Oct 09 20:17:14 crc kubenswrapper[4907]: localhost Oct 09 20:17:14 crc kubenswrapper[4907]: [local] Oct 09 20:17:14 crc kubenswrapper[4907]: localhost Oct 09 20:17:14 crc kubenswrapper[4907]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_PLAYBOOK,Value:ci/run_cloudkitty_tests.yml,ValueFrom:nil,},EnvVar{Name:POD_DEBUG,Value:true,ValueFrom:nil,},EnvVar{Name:POD_INSTALL_COLLECTIONS,Value:git+https://github.com/elfiesmelfie/feature-verification-tests.git,master,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{2147483648 0} {} 2Gi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/ansible,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/AnsibleTests/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/ansible/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/var/lib/ansible/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:workload-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/test_keypair.key,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:compute-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/.ssh/compute_id,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:ni
l,SELinuxOptions:nil,RunAsUser:*227,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:*227,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ansible-tests-cloudkitty-s00-cloudkitty_openstack(92e389b9-a749-4f7e-9c0a-3c901329ff51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Oct 09 20:17:14 crc kubenswrapper[4907]: > logger="UnhandledError" Oct 09 20:17:14 crc kubenswrapper[4907]: E1009 20:17:14.072766 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansible-tests-cloudkitty\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" podUID="92e389b9-a749-4f7e-9c0a-3c901329ff51" Oct 09 20:17:14 crc kubenswrapper[4907]: E1009 20:17:14.598631 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansible-tests-cloudkitty\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified\\\"\"" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" podUID="92e389b9-a749-4f7e-9c0a-3c901329ff51" Oct 09 20:17:26 crc kubenswrapper[4907]: I1009 20:17:26.737971 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" event={"ID":"92e389b9-a749-4f7e-9c0a-3c901329ff51","Type":"ContainerStarted","Data":"1cc183aa4a4d03286d8d94be68e4bc02f11513f0d8e6b696dbe548d3810980f6"} Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.457277 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" podStartSLOduration=12.122541498 podStartE2EDuration="38.45725129s" podCreationTimestamp="2025-10-09 20:16:57 +0000 UTC" firstStartedPulling="2025-10-09 20:16:59.310827492 +0000 UTC m=+2904.842794981" lastFinishedPulling="2025-10-09 20:17:25.645537284 +0000 UTC m=+2931.177504773" observedRunningTime="2025-10-09 20:17:26.774832542 +0000 UTC m=+2932.306800041" watchObservedRunningTime="2025-10-09 20:17:35.45725129 +0000 UTC m=+2940.989218779" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.458688 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tr2lk"] Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.461429 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.482399 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr2lk"] Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.646756 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvd7\" (UniqueName: \"kubernetes.io/projected/e618b793-2764-4b56-a89e-a9ad8a3e9a94-kube-api-access-hxvd7\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.646834 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-catalog-content\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.647245 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-utilities\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.749136 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-utilities\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.749228 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvd7\" (UniqueName: \"kubernetes.io/projected/e618b793-2764-4b56-a89e-a9ad8a3e9a94-kube-api-access-hxvd7\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.749246 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-catalog-content\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.750130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-catalog-content\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.750373 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-utilities\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.782641 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxvd7\" (UniqueName: \"kubernetes.io/projected/e618b793-2764-4b56-a89e-a9ad8a3e9a94-kube-api-access-hxvd7\") pod \"redhat-operators-tr2lk\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:35 crc kubenswrapper[4907]: I1009 20:17:35.791126 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.268869 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr2lk"] Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.299227 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.299296 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.299350 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 
20:17:36.300339 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.300398 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" gracePeriod=600 Oct 09 20:17:36 crc kubenswrapper[4907]: E1009 20:17:36.425016 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.846898 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" exitCode=0 Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.846956 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c"} Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.848370 4907 scope.go:117] "RemoveContainer" 
containerID="557db504600f3c41488c9227100b74382f080751a479b2a9db5ed710b0f070c4" Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.849149 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:17:36 crc kubenswrapper[4907]: E1009 20:17:36.849644 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.851287 4907 generic.go:334] "Generic (PLEG): container finished" podID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerID="fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca" exitCode=0 Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.851334 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr2lk" event={"ID":"e618b793-2764-4b56-a89e-a9ad8a3e9a94","Type":"ContainerDied","Data":"fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca"} Oct 09 20:17:36 crc kubenswrapper[4907]: I1009 20:17:36.851369 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr2lk" event={"ID":"e618b793-2764-4b56-a89e-a9ad8a3e9a94","Type":"ContainerStarted","Data":"59d204bd65bfeb5fd2bbb10e26f8b2dad17926bdabdf89b199c8038a907bda7a"} Oct 09 20:17:37 crc kubenswrapper[4907]: I1009 20:17:37.863880 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr2lk" event={"ID":"e618b793-2764-4b56-a89e-a9ad8a3e9a94","Type":"ContainerStarted","Data":"6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148"} Oct 09 20:17:42 crc 
kubenswrapper[4907]: I1009 20:17:42.938648 4907 generic.go:334] "Generic (PLEG): container finished" podID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerID="6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148" exitCode=0 Oct 09 20:17:42 crc kubenswrapper[4907]: I1009 20:17:42.938808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr2lk" event={"ID":"e618b793-2764-4b56-a89e-a9ad8a3e9a94","Type":"ContainerDied","Data":"6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148"} Oct 09 20:17:43 crc kubenswrapper[4907]: I1009 20:17:43.954783 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr2lk" event={"ID":"e618b793-2764-4b56-a89e-a9ad8a3e9a94","Type":"ContainerStarted","Data":"f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc"} Oct 09 20:17:43 crc kubenswrapper[4907]: I1009 20:17:43.979127 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tr2lk" podStartSLOduration=2.467558233 podStartE2EDuration="8.979106177s" podCreationTimestamp="2025-10-09 20:17:35 +0000 UTC" firstStartedPulling="2025-10-09 20:17:36.853685123 +0000 UTC m=+2942.385652612" lastFinishedPulling="2025-10-09 20:17:43.365233067 +0000 UTC m=+2948.897200556" observedRunningTime="2025-10-09 20:17:43.977299372 +0000 UTC m=+2949.509266851" watchObservedRunningTime="2025-10-09 20:17:43.979106177 +0000 UTC m=+2949.511073676" Oct 09 20:17:45 crc kubenswrapper[4907]: I1009 20:17:45.791596 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:45 crc kubenswrapper[4907]: I1009 20:17:45.791939 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.460876 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-dg7rr"] Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.462880 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.477630 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dg7rr"] Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.586204 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-utilities\") pod \"community-operators-dg7rr\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.586394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-catalog-content\") pod \"community-operators-dg7rr\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.586635 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbwp\" (UniqueName: \"kubernetes.io/projected/5395cd23-83ce-428d-a887-e378a756404c-kube-api-access-plbwp\") pod \"community-operators-dg7rr\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.688558 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-catalog-content\") pod \"community-operators-dg7rr\" (UID: 
\"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.688622 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbwp\" (UniqueName: \"kubernetes.io/projected/5395cd23-83ce-428d-a887-e378a756404c-kube-api-access-plbwp\") pod \"community-operators-dg7rr\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.688731 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-utilities\") pod \"community-operators-dg7rr\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.689047 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-catalog-content\") pod \"community-operators-dg7rr\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.689093 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-utilities\") pod \"community-operators-dg7rr\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.714014 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbwp\" (UniqueName: \"kubernetes.io/projected/5395cd23-83ce-428d-a887-e378a756404c-kube-api-access-plbwp\") pod \"community-operators-dg7rr\" (UID: 
\"5395cd23-83ce-428d-a887-e378a756404c\") " pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.785596 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:46 crc kubenswrapper[4907]: I1009 20:17:46.866734 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tr2lk" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="registry-server" probeResult="failure" output=< Oct 09 20:17:46 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 20:17:46 crc kubenswrapper[4907]: > Oct 09 20:17:47 crc kubenswrapper[4907]: I1009 20:17:47.359319 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dg7rr"] Oct 09 20:17:47 crc kubenswrapper[4907]: W1009 20:17:47.368758 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5395cd23_83ce_428d_a887_e378a756404c.slice/crio-2a3f5d48abb01e62af212a06ffa7740a5c81fb1e1feaf7b97c4422484e723b5d WatchSource:0}: Error finding container 2a3f5d48abb01e62af212a06ffa7740a5c81fb1e1feaf7b97c4422484e723b5d: Status 404 returned error can't find the container with id 2a3f5d48abb01e62af212a06ffa7740a5c81fb1e1feaf7b97c4422484e723b5d Oct 09 20:17:48 crc kubenswrapper[4907]: I1009 20:17:48.008369 4907 generic.go:334] "Generic (PLEG): container finished" podID="5395cd23-83ce-428d-a887-e378a756404c" containerID="20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9" exitCode=0 Oct 09 20:17:48 crc kubenswrapper[4907]: I1009 20:17:48.008628 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg7rr" event={"ID":"5395cd23-83ce-428d-a887-e378a756404c","Type":"ContainerDied","Data":"20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9"} Oct 
09 20:17:48 crc kubenswrapper[4907]: I1009 20:17:48.008653 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg7rr" event={"ID":"5395cd23-83ce-428d-a887-e378a756404c","Type":"ContainerStarted","Data":"2a3f5d48abb01e62af212a06ffa7740a5c81fb1e1feaf7b97c4422484e723b5d"} Oct 09 20:17:49 crc kubenswrapper[4907]: I1009 20:17:49.019091 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg7rr" event={"ID":"5395cd23-83ce-428d-a887-e378a756404c","Type":"ContainerStarted","Data":"568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d"} Oct 09 20:17:49 crc kubenswrapper[4907]: I1009 20:17:49.152004 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:17:49 crc kubenswrapper[4907]: E1009 20:17:49.152312 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:17:51 crc kubenswrapper[4907]: I1009 20:17:51.042575 4907 generic.go:334] "Generic (PLEG): container finished" podID="5395cd23-83ce-428d-a887-e378a756404c" containerID="568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d" exitCode=0 Oct 09 20:17:51 crc kubenswrapper[4907]: I1009 20:17:51.042616 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg7rr" event={"ID":"5395cd23-83ce-428d-a887-e378a756404c","Type":"ContainerDied","Data":"568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d"} Oct 09 20:17:52 crc kubenswrapper[4907]: I1009 20:17:52.068917 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-dg7rr" event={"ID":"5395cd23-83ce-428d-a887-e378a756404c","Type":"ContainerStarted","Data":"7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3"} Oct 09 20:17:52 crc kubenswrapper[4907]: I1009 20:17:52.092587 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dg7rr" podStartSLOduration=2.56759115 podStartE2EDuration="6.092563012s" podCreationTimestamp="2025-10-09 20:17:46 +0000 UTC" firstStartedPulling="2025-10-09 20:17:48.010509324 +0000 UTC m=+2953.542476813" lastFinishedPulling="2025-10-09 20:17:51.535481186 +0000 UTC m=+2957.067448675" observedRunningTime="2025-10-09 20:17:52.084903161 +0000 UTC m=+2957.616870650" watchObservedRunningTime="2025-10-09 20:17:52.092563012 +0000 UTC m=+2957.624530501" Oct 09 20:17:56 crc kubenswrapper[4907]: I1009 20:17:56.785922 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:56 crc kubenswrapper[4907]: I1009 20:17:56.786600 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:56 crc kubenswrapper[4907]: I1009 20:17:56.836557 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tr2lk" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="registry-server" probeResult="failure" output=< Oct 09 20:17:56 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 20:17:56 crc kubenswrapper[4907]: > Oct 09 20:17:56 crc kubenswrapper[4907]: I1009 20:17:56.838949 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:57 crc kubenswrapper[4907]: I1009 20:17:57.211282 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:57 crc kubenswrapper[4907]: I1009 20:17:57.266362 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dg7rr"] Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.154694 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dg7rr" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="registry-server" containerID="cri-o://7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3" gracePeriod=2 Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.743947 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.891067 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plbwp\" (UniqueName: \"kubernetes.io/projected/5395cd23-83ce-428d-a887-e378a756404c-kube-api-access-plbwp\") pod \"5395cd23-83ce-428d-a887-e378a756404c\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.891279 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-catalog-content\") pod \"5395cd23-83ce-428d-a887-e378a756404c\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.891390 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-utilities\") pod \"5395cd23-83ce-428d-a887-e378a756404c\" (UID: \"5395cd23-83ce-428d-a887-e378a756404c\") " Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.892003 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-utilities" (OuterVolumeSpecName: "utilities") pod "5395cd23-83ce-428d-a887-e378a756404c" (UID: "5395cd23-83ce-428d-a887-e378a756404c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.897675 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5395cd23-83ce-428d-a887-e378a756404c-kube-api-access-plbwp" (OuterVolumeSpecName: "kube-api-access-plbwp") pod "5395cd23-83ce-428d-a887-e378a756404c" (UID: "5395cd23-83ce-428d-a887-e378a756404c"). InnerVolumeSpecName "kube-api-access-plbwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.934556 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5395cd23-83ce-428d-a887-e378a756404c" (UID: "5395cd23-83ce-428d-a887-e378a756404c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.993495 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plbwp\" (UniqueName: \"kubernetes.io/projected/5395cd23-83ce-428d-a887-e378a756404c-kube-api-access-plbwp\") on node \"crc\" DevicePath \"\"" Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.993732 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:17:59 crc kubenswrapper[4907]: I1009 20:17:59.993808 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5395cd23-83ce-428d-a887-e378a756404c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.169712 4907 generic.go:334] "Generic (PLEG): container finished" podID="5395cd23-83ce-428d-a887-e378a756404c" containerID="7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3" exitCode=0 Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.169761 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg7rr" event={"ID":"5395cd23-83ce-428d-a887-e378a756404c","Type":"ContainerDied","Data":"7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3"} Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.169792 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dg7rr" event={"ID":"5395cd23-83ce-428d-a887-e378a756404c","Type":"ContainerDied","Data":"2a3f5d48abb01e62af212a06ffa7740a5c81fb1e1feaf7b97c4422484e723b5d"} Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.169813 4907 scope.go:117] "RemoveContainer" containerID="7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 
20:18:00.169966 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dg7rr" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.206329 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dg7rr"] Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.209042 4907 scope.go:117] "RemoveContainer" containerID="568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.220609 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dg7rr"] Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.246267 4907 scope.go:117] "RemoveContainer" containerID="20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.289585 4907 scope.go:117] "RemoveContainer" containerID="7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3" Oct 09 20:18:00 crc kubenswrapper[4907]: E1009 20:18:00.290126 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3\": container with ID starting with 7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3 not found: ID does not exist" containerID="7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.290163 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3"} err="failed to get container status \"7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3\": rpc error: code = NotFound desc = could not find container \"7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3\": container with ID starting with 
7cb1425eeda254d3e0447c65bbd55277189586e81f2804e22de174ed45c681a3 not found: ID does not exist" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.290189 4907 scope.go:117] "RemoveContainer" containerID="568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d" Oct 09 20:18:00 crc kubenswrapper[4907]: E1009 20:18:00.290662 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d\": container with ID starting with 568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d not found: ID does not exist" containerID="568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.290688 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d"} err="failed to get container status \"568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d\": rpc error: code = NotFound desc = could not find container \"568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d\": container with ID starting with 568a31166713cd1f9ddd1c4b410513086c9186e2f5dbac3d81e5ef6e9e97e72d not found: ID does not exist" Oct 09 20:18:00 crc kubenswrapper[4907]: I1009 20:18:00.290707 4907 scope.go:117] "RemoveContainer" containerID="20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9" Oct 09 20:18:00 crc kubenswrapper[4907]: E1009 20:18:00.291008 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9\": container with ID starting with 20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9 not found: ID does not exist" containerID="20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9" Oct 09 20:18:00 crc 
kubenswrapper[4907]: I1009 20:18:00.291034 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9"} err="failed to get container status \"20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9\": rpc error: code = NotFound desc = could not find container \"20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9\": container with ID starting with 20dcb9e0d5db6491fef94f92e4914854c5907103c7be0a70f1f3cb3fd365e2b9 not found: ID does not exist" Oct 09 20:18:01 crc kubenswrapper[4907]: I1009 20:18:01.152167 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:18:01 crc kubenswrapper[4907]: E1009 20:18:01.152442 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:18:01 crc kubenswrapper[4907]: I1009 20:18:01.172760 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5395cd23-83ce-428d-a887-e378a756404c" path="/var/lib/kubelet/pods/5395cd23-83ce-428d-a887-e378a756404c/volumes" Oct 09 20:18:06 crc kubenswrapper[4907]: I1009 20:18:06.849142 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tr2lk" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="registry-server" probeResult="failure" output=< Oct 09 20:18:06 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 20:18:06 crc kubenswrapper[4907]: > Oct 09 20:18:12 crc kubenswrapper[4907]: I1009 20:18:12.151832 4907 scope.go:117] 
"RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:18:12 crc kubenswrapper[4907]: E1009 20:18:12.152661 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:18:15 crc kubenswrapper[4907]: I1009 20:18:15.865754 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:18:15 crc kubenswrapper[4907]: I1009 20:18:15.960241 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:18:16 crc kubenswrapper[4907]: I1009 20:18:16.121392 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tr2lk"] Oct 09 20:18:17 crc kubenswrapper[4907]: I1009 20:18:17.379299 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tr2lk" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="registry-server" containerID="cri-o://f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc" gracePeriod=2 Oct 09 20:18:17 crc kubenswrapper[4907]: I1009 20:18:17.957301 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.135958 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-utilities\") pod \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.136336 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxvd7\" (UniqueName: \"kubernetes.io/projected/e618b793-2764-4b56-a89e-a9ad8a3e9a94-kube-api-access-hxvd7\") pod \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.136447 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-catalog-content\") pod \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\" (UID: \"e618b793-2764-4b56-a89e-a9ad8a3e9a94\") " Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.136784 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-utilities" (OuterVolumeSpecName: "utilities") pod "e618b793-2764-4b56-a89e-a9ad8a3e9a94" (UID: "e618b793-2764-4b56-a89e-a9ad8a3e9a94"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.137458 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.148147 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e618b793-2764-4b56-a89e-a9ad8a3e9a94-kube-api-access-hxvd7" (OuterVolumeSpecName: "kube-api-access-hxvd7") pod "e618b793-2764-4b56-a89e-a9ad8a3e9a94" (UID: "e618b793-2764-4b56-a89e-a9ad8a3e9a94"). InnerVolumeSpecName "kube-api-access-hxvd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.229336 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e618b793-2764-4b56-a89e-a9ad8a3e9a94" (UID: "e618b793-2764-4b56-a89e-a9ad8a3e9a94"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.239914 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxvd7\" (UniqueName: \"kubernetes.io/projected/e618b793-2764-4b56-a89e-a9ad8a3e9a94-kube-api-access-hxvd7\") on node \"crc\" DevicePath \"\"" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.239965 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e618b793-2764-4b56-a89e-a9ad8a3e9a94-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.392898 4907 generic.go:334] "Generic (PLEG): container finished" podID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerID="f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc" exitCode=0 Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.392965 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr2lk" event={"ID":"e618b793-2764-4b56-a89e-a9ad8a3e9a94","Type":"ContainerDied","Data":"f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc"} Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.393032 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr2lk" event={"ID":"e618b793-2764-4b56-a89e-a9ad8a3e9a94","Type":"ContainerDied","Data":"59d204bd65bfeb5fd2bbb10e26f8b2dad17926bdabdf89b199c8038a907bda7a"} Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.393052 4907 scope.go:117] "RemoveContainer" containerID="f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.393053 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tr2lk" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.418521 4907 scope.go:117] "RemoveContainer" containerID="6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.456900 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tr2lk"] Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.469456 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tr2lk"] Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.476814 4907 scope.go:117] "RemoveContainer" containerID="fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.500247 4907 scope.go:117] "RemoveContainer" containerID="f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc" Oct 09 20:18:18 crc kubenswrapper[4907]: E1009 20:18:18.500747 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc\": container with ID starting with f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc not found: ID does not exist" containerID="f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.500779 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc"} err="failed to get container status \"f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc\": rpc error: code = NotFound desc = could not find container \"f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc\": container with ID starting with f43698be177223adf7e301bce8259c0abc16a6a6f64e2feeabc438a8c241c7bc not found: ID does 
not exist" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.500802 4907 scope.go:117] "RemoveContainer" containerID="6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148" Oct 09 20:18:18 crc kubenswrapper[4907]: E1009 20:18:18.501101 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148\": container with ID starting with 6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148 not found: ID does not exist" containerID="6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.501120 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148"} err="failed to get container status \"6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148\": rpc error: code = NotFound desc = could not find container \"6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148\": container with ID starting with 6a4fd2735618bcbd5a443788e4495a73f05aa82c63f79c6effb3d83696804148 not found: ID does not exist" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.501133 4907 scope.go:117] "RemoveContainer" containerID="fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca" Oct 09 20:18:18 crc kubenswrapper[4907]: E1009 20:18:18.501549 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca\": container with ID starting with fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca not found: ID does not exist" containerID="fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca" Oct 09 20:18:18 crc kubenswrapper[4907]: I1009 20:18:18.501675 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca"} err="failed to get container status \"fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca\": rpc error: code = NotFound desc = could not find container \"fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca\": container with ID starting with fd5060db56a2f141c6d9dcc1e040d4e9d1470f64f114b46faaf2ff8a006efbca not found: ID does not exist" Oct 09 20:18:19 crc kubenswrapper[4907]: I1009 20:18:19.173423 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" path="/var/lib/kubelet/pods/e618b793-2764-4b56-a89e-a9ad8a3e9a94/volumes" Oct 09 20:18:23 crc kubenswrapper[4907]: I1009 20:18:23.151633 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:18:23 crc kubenswrapper[4907]: E1009 20:18:23.152381 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:18:36 crc kubenswrapper[4907]: I1009 20:18:36.151383 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:18:36 crc kubenswrapper[4907]: E1009 20:18:36.152052 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:18:50 crc kubenswrapper[4907]: I1009 20:18:50.152258 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:18:50 crc kubenswrapper[4907]: E1009 20:18:50.153246 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:19:02 crc kubenswrapper[4907]: I1009 20:19:02.151835 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:19:02 crc kubenswrapper[4907]: E1009 20:19:02.152759 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:19:15 crc kubenswrapper[4907]: I1009 20:19:15.159545 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:19:15 crc kubenswrapper[4907]: E1009 20:19:15.165625 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:19:23 crc kubenswrapper[4907]: I1009 20:19:23.140913 4907 generic.go:334] "Generic (PLEG): container finished" podID="92e389b9-a749-4f7e-9c0a-3c901329ff51" containerID="1cc183aa4a4d03286d8d94be68e4bc02f11513f0d8e6b696dbe548d3810980f6" exitCode=2 Oct 09 20:19:23 crc kubenswrapper[4907]: I1009 20:19:23.141042 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" event={"ID":"92e389b9-a749-4f7e-9c0a-3c901329ff51","Type":"ContainerDied","Data":"1cc183aa4a4d03286d8d94be68e4bc02f11513f0d8e6b696dbe548d3810980f6"} Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.636965 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.707730 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-workload-ssh-secret\") pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.708038 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-compute-ssh-secret\") pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.708274 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\") 
pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.708345 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-temporary\") pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.708422 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-ca-certs\") pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.708491 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config-secret\") pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.708558 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-workdir\") pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.708618 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config\") pod \"92e389b9-a749-4f7e-9c0a-3c901329ff51\" (UID: \"92e389b9-a749-4f7e-9c0a-3c901329ff51\") " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 
20:19:24.708901 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.709383 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.735644 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5" (OuterVolumeSpecName: "test-operator-logs") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "pvc-173cbc7b-f958-457c-9718-7cb37e3552b5". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.748773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-compute-ssh-secret" (OuterVolumeSpecName: "compute-ssh-secret") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "compute-ssh-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.754420 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-workload-ssh-secret" (OuterVolumeSpecName: "workload-ssh-secret") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "workload-ssh-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.771964 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.779296 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.782661 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.801669 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "92e389b9-a749-4f7e-9c0a-3c901329ff51" (UID: "92e389b9-a749-4f7e-9c0a-3c901329ff51"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.811612 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/92e389b9-a749-4f7e-9c0a-3c901329ff51-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.811651 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.811667 4907 reconciler_common.go:293] "Volume detached for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-workload-ssh-secret\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.811679 4907 reconciler_common.go:293] "Volume detached for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-compute-ssh-secret\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.811730 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\") on node \"crc\" " Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.811746 4907 
reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.811760 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/92e389b9-a749-4f7e-9c0a-3c901329ff51-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.857408 4907 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.857586 4907 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-173cbc7b-f958-457c-9718-7cb37e3552b5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5") on node "crc" Oct 09 20:19:24 crc kubenswrapper[4907]: I1009 20:19:24.913330 4907 reconciler_common.go:293] "Volume detached for volume \"pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-173cbc7b-f958-457c-9718-7cb37e3552b5\") on node \"crc\" DevicePath \"\"" Oct 09 20:19:25 crc kubenswrapper[4907]: I1009 20:19:25.163432 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" Oct 09 20:19:25 crc kubenswrapper[4907]: I1009 20:19:25.164052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansible-tests-cloudkitty-s00-cloudkitty" event={"ID":"92e389b9-a749-4f7e-9c0a-3c901329ff51","Type":"ContainerDied","Data":"d7999b012b0601e398c87bd99c22c088018a202b2a292279688ce95958b59a6c"} Oct 09 20:19:25 crc kubenswrapper[4907]: I1009 20:19:25.164079 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7999b012b0601e398c87bd99c22c088018a202b2a292279688ce95958b59a6c" Oct 09 20:19:28 crc kubenswrapper[4907]: I1009 20:19:28.153031 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:19:28 crc kubenswrapper[4907]: E1009 20:19:28.154319 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:19:40 crc kubenswrapper[4907]: I1009 20:19:40.152385 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:19:40 crc kubenswrapper[4907]: E1009 20:19:40.153799 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:19:53 crc 
kubenswrapper[4907]: I1009 20:19:53.152208 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:19:53 crc kubenswrapper[4907]: E1009 20:19:53.152952 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:20:04 crc kubenswrapper[4907]: I1009 20:20:04.152094 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:20:04 crc kubenswrapper[4907]: E1009 20:20:04.154041 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.322250 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 20:20:07 crc kubenswrapper[4907]: E1009 20:20:07.323193 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="extract-utilities" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323208 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="extract-utilities" Oct 09 20:20:07 crc kubenswrapper[4907]: E1009 20:20:07.323229 4907 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="extract-utilities" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323236 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="extract-utilities" Oct 09 20:20:07 crc kubenswrapper[4907]: E1009 20:20:07.323256 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="extract-content" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323261 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="extract-content" Oct 09 20:20:07 crc kubenswrapper[4907]: E1009 20:20:07.323278 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e389b9-a749-4f7e-9c0a-3c901329ff51" containerName="ansible-tests-cloudkitty" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323284 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e389b9-a749-4f7e-9c0a-3c901329ff51" containerName="ansible-tests-cloudkitty" Oct 09 20:20:07 crc kubenswrapper[4907]: E1009 20:20:07.323295 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="registry-server" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323300 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="registry-server" Oct 09 20:20:07 crc kubenswrapper[4907]: E1009 20:20:07.323311 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="extract-content" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323318 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="extract-content" Oct 09 20:20:07 crc kubenswrapper[4907]: E1009 20:20:07.323326 4907 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="registry-server" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323332 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="registry-server" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323532 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e389b9-a749-4f7e-9c0a-3c901329ff51" containerName="ansible-tests-cloudkitty" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323550 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5395cd23-83ce-428d-a887-e378a756404c" containerName="registry-server" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.323557 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e618b793-2764-4b56-a89e-a9ad8a3e9a94" containerName="registry-server" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.324219 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.326345 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-f86fh" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.326673 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.326850 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.329823 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.353222 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456022 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456098 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zshrp\" (UniqueName: \"kubernetes.io/projected/1049514e-2b9c-426e-9534-677c595d39d8-kube-api-access-zshrp\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456181 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456205 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456223 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456321 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456452 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456545 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ssh-key\") pod 
\"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.456653 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.558348 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-config-data\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.558965 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.559011 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.559056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.559126 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.559164 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.559508 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.559666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.559729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zshrp\" (UniqueName: \"kubernetes.io/projected/1049514e-2b9c-426e-9534-677c595d39d8-kube-api-access-zshrp\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 
crc kubenswrapper[4907]: I1009 20:20:07.560390 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.560423 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.560767 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.561112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.561605 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-config-data\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.567884 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.568286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.569097 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.586680 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zshrp\" (UniqueName: \"kubernetes.io/projected/1049514e-2b9c-426e-9534-677c595d39d8-kube-api-access-zshrp\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.626106 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " pod="openstack/tempest-tests-tempest" Oct 09 20:20:07 crc kubenswrapper[4907]: I1009 20:20:07.652904 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 20:20:08 crc kubenswrapper[4907]: I1009 20:20:08.173381 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 20:20:08 crc kubenswrapper[4907]: I1009 20:20:08.663589 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1049514e-2b9c-426e-9534-677c595d39d8","Type":"ContainerStarted","Data":"1a67fb7d6414ded3a60c4a0027e5e0a5cc07cf00da1a0f5974bf99dc2fa2c4e4"} Oct 09 20:20:19 crc kubenswrapper[4907]: I1009 20:20:19.152301 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:20:19 crc kubenswrapper[4907]: E1009 20:20:19.153166 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:20:30 crc kubenswrapper[4907]: I1009 20:20:30.151655 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:20:30 crc kubenswrapper[4907]: E1009 20:20:30.152382 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:20:33 crc kubenswrapper[4907]: E1009 20:20:33.807159 4907 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 09 20:20:33 crc kubenswrapper[4907]: E1009 20:20:33.807914 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh
_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zshrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(1049514e-2b9c-426e-9534-677c595d39d8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 20:20:33 crc kubenswrapper[4907]: E1009 20:20:33.809607 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="1049514e-2b9c-426e-9534-677c595d39d8" Oct 09 20:20:33 crc kubenswrapper[4907]: E1009 20:20:33.939210 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="1049514e-2b9c-426e-9534-677c595d39d8" Oct 09 20:20:41 crc kubenswrapper[4907]: I1009 20:20:41.803194 4907 scope.go:117] "RemoveContainer" containerID="f48d9e50a44d366055ca54facb6090a89759038d0919ff38c294907c7a9a275c" Oct 09 20:20:41 crc kubenswrapper[4907]: I1009 20:20:41.841891 4907 scope.go:117] "RemoveContainer" containerID="d33783f2bbe5ebf346dd7b6037dd0bf920df86595c40c175ce1c769027602c9b" Oct 09 20:20:41 crc kubenswrapper[4907]: I1009 20:20:41.868185 4907 scope.go:117] "RemoveContainer" containerID="1725b96181c6e970bfd9df43acd134327389c5592a6be148c16b2d384f29700f" Oct 09 20:20:41 crc kubenswrapper[4907]: I1009 20:20:41.887449 4907 scope.go:117] "RemoveContainer" containerID="2211b364fea9c948032728c78597a6a3f9a058fe6967d914af4c36ac26ba0e56" Oct 09 20:20:43 crc kubenswrapper[4907]: I1009 20:20:43.151307 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:20:43 crc kubenswrapper[4907]: E1009 20:20:43.151984 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:20:48 crc kubenswrapper[4907]: I1009 20:20:48.610703 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 09 20:20:50 crc kubenswrapper[4907]: I1009 20:20:50.112529 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"1049514e-2b9c-426e-9534-677c595d39d8","Type":"ContainerStarted","Data":"34c9a5bdb7fbfd8cff047f6e916b82889894cc595f0a38632fb38591f8bb5666"} Oct 09 20:20:50 crc kubenswrapper[4907]: I1009 20:20:50.157269 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.737386197 podStartE2EDuration="44.15724984s" podCreationTimestamp="2025-10-09 20:20:06 +0000 UTC" firstStartedPulling="2025-10-09 20:20:08.187933432 +0000 UTC m=+3093.719900941" lastFinishedPulling="2025-10-09 20:20:48.607797095 +0000 UTC m=+3134.139764584" observedRunningTime="2025-10-09 20:20:50.141717063 +0000 UTC m=+3135.673684562" watchObservedRunningTime="2025-10-09 20:20:50.15724984 +0000 UTC m=+3135.689217329" Oct 09 20:20:58 crc kubenswrapper[4907]: I1009 20:20:58.152027 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:20:58 crc kubenswrapper[4907]: E1009 20:20:58.152853 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:21:11 crc kubenswrapper[4907]: I1009 20:21:11.151214 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:21:11 crc kubenswrapper[4907]: E1009 20:21:11.151858 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:21:24 crc kubenswrapper[4907]: I1009 20:21:24.151371 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:21:24 crc kubenswrapper[4907]: E1009 20:21:24.152406 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:21:38 crc kubenswrapper[4907]: I1009 20:21:38.151242 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:21:38 crc kubenswrapper[4907]: E1009 20:21:38.152056 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:21:49 crc kubenswrapper[4907]: I1009 20:21:49.151984 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:21:49 crc kubenswrapper[4907]: E1009 20:21:49.152908 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:22:00 crc kubenswrapper[4907]: I1009 20:22:00.151537 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:22:00 crc kubenswrapper[4907]: E1009 20:22:00.152112 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:22:11 crc kubenswrapper[4907]: I1009 20:22:11.151657 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:22:11 crc kubenswrapper[4907]: E1009 20:22:11.152385 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:22:25 crc kubenswrapper[4907]: I1009 20:22:25.164426 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:22:25 crc kubenswrapper[4907]: E1009 20:22:25.165201 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:22:39 crc kubenswrapper[4907]: I1009 20:22:39.151169 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:22:40 crc kubenswrapper[4907]: I1009 20:22:40.329736 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"96432ef9376f05f960921874ca8f70e980e9749884376496546f629d7367aada"} Oct 09 20:24:45 crc kubenswrapper[4907]: I1009 20:24:45.059730 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-lb67g"] Oct 09 20:24:45 crc kubenswrapper[4907]: I1009 20:24:45.077388 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-lb67g"] Oct 09 20:24:45 crc kubenswrapper[4907]: I1009 20:24:45.198995 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e424b2b3-47b9-4089-bef8-24998fb2d49e" path="/var/lib/kubelet/pods/e424b2b3-47b9-4089-bef8-24998fb2d49e/volumes" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.381678 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8rhdj"] Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.385748 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.402133 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rhdj"] Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.541859 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdrf\" (UniqueName: \"kubernetes.io/projected/818205e7-2675-44eb-a9fb-e7834ac224d3-kube-api-access-6tdrf\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.542314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-utilities\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.542336 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-catalog-content\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.643947 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdrf\" (UniqueName: \"kubernetes.io/projected/818205e7-2675-44eb-a9fb-e7834ac224d3-kube-api-access-6tdrf\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.644181 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-utilities\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.644207 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-catalog-content\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.644729 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-utilities\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.644789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-catalog-content\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.664236 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdrf\" (UniqueName: \"kubernetes.io/projected/818205e7-2675-44eb-a9fb-e7834ac224d3-kube-api-access-6tdrf\") pod \"redhat-marketplace-8rhdj\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:56 crc kubenswrapper[4907]: I1009 20:24:56.769574 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:24:57 crc kubenswrapper[4907]: I1009 20:24:57.228994 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rhdj"] Oct 09 20:24:57 crc kubenswrapper[4907]: I1009 20:24:57.756462 4907 generic.go:334] "Generic (PLEG): container finished" podID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerID="15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1" exitCode=0 Oct 09 20:24:57 crc kubenswrapper[4907]: I1009 20:24:57.756576 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rhdj" event={"ID":"818205e7-2675-44eb-a9fb-e7834ac224d3","Type":"ContainerDied","Data":"15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1"} Oct 09 20:24:57 crc kubenswrapper[4907]: I1009 20:24:57.756799 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rhdj" event={"ID":"818205e7-2675-44eb-a9fb-e7834ac224d3","Type":"ContainerStarted","Data":"b225e1f2c70037b4d2c2414fa38c3ecb22a83f3926c9f5e450938ddc1c3434e2"} Oct 09 20:24:57 crc kubenswrapper[4907]: I1009 20:24:57.761210 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 20:24:59 crc kubenswrapper[4907]: I1009 20:24:59.803771 4907 generic.go:334] "Generic (PLEG): container finished" podID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerID="1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8" exitCode=0 Oct 09 20:24:59 crc kubenswrapper[4907]: I1009 20:24:59.804681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rhdj" event={"ID":"818205e7-2675-44eb-a9fb-e7834ac224d3","Type":"ContainerDied","Data":"1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8"} Oct 09 20:25:00 crc kubenswrapper[4907]: I1009 20:25:00.818648 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-8rhdj" event={"ID":"818205e7-2675-44eb-a9fb-e7834ac224d3","Type":"ContainerStarted","Data":"3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045"} Oct 09 20:25:00 crc kubenswrapper[4907]: I1009 20:25:00.853547 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8rhdj" podStartSLOduration=2.178118235 podStartE2EDuration="4.85352468s" podCreationTimestamp="2025-10-09 20:24:56 +0000 UTC" firstStartedPulling="2025-10-09 20:24:57.760782915 +0000 UTC m=+3383.292750444" lastFinishedPulling="2025-10-09 20:25:00.4361894 +0000 UTC m=+3385.968156889" observedRunningTime="2025-10-09 20:25:00.843513261 +0000 UTC m=+3386.375480750" watchObservedRunningTime="2025-10-09 20:25:00.85352468 +0000 UTC m=+3386.385492169" Oct 09 20:25:04 crc kubenswrapper[4907]: I1009 20:25:04.069710 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-0e83-account-create-54jfg"] Oct 09 20:25:04 crc kubenswrapper[4907]: I1009 20:25:04.087800 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-0e83-account-create-54jfg"] Oct 09 20:25:05 crc kubenswrapper[4907]: I1009 20:25:05.170646 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8acc1c-79a2-43fc-9493-5baff3406366" path="/var/lib/kubelet/pods/7f8acc1c-79a2-43fc-9493-5baff3406366/volumes" Oct 09 20:25:06 crc kubenswrapper[4907]: I1009 20:25:06.299441 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:25:06 crc kubenswrapper[4907]: I1009 20:25:06.299811 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:25:06 crc kubenswrapper[4907]: I1009 20:25:06.770224 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:25:06 crc kubenswrapper[4907]: I1009 20:25:06.770281 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:25:06 crc kubenswrapper[4907]: I1009 20:25:06.824129 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:25:06 crc kubenswrapper[4907]: I1009 20:25:06.925970 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:25:07 crc kubenswrapper[4907]: I1009 20:25:07.061765 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rhdj"] Oct 09 20:25:08 crc kubenswrapper[4907]: I1009 20:25:08.891082 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8rhdj" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="registry-server" containerID="cri-o://3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045" gracePeriod=2 Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.580787 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.730283 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdrf\" (UniqueName: \"kubernetes.io/projected/818205e7-2675-44eb-a9fb-e7834ac224d3-kube-api-access-6tdrf\") pod \"818205e7-2675-44eb-a9fb-e7834ac224d3\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.730415 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-utilities\") pod \"818205e7-2675-44eb-a9fb-e7834ac224d3\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.730506 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-catalog-content\") pod \"818205e7-2675-44eb-a9fb-e7834ac224d3\" (UID: \"818205e7-2675-44eb-a9fb-e7834ac224d3\") " Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.731215 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-utilities" (OuterVolumeSpecName: "utilities") pod "818205e7-2675-44eb-a9fb-e7834ac224d3" (UID: "818205e7-2675-44eb-a9fb-e7834ac224d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.736444 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818205e7-2675-44eb-a9fb-e7834ac224d3-kube-api-access-6tdrf" (OuterVolumeSpecName: "kube-api-access-6tdrf") pod "818205e7-2675-44eb-a9fb-e7834ac224d3" (UID: "818205e7-2675-44eb-a9fb-e7834ac224d3"). InnerVolumeSpecName "kube-api-access-6tdrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.745035 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "818205e7-2675-44eb-a9fb-e7834ac224d3" (UID: "818205e7-2675-44eb-a9fb-e7834ac224d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.832610 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.832639 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/818205e7-2675-44eb-a9fb-e7834ac224d3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.832653 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdrf\" (UniqueName: \"kubernetes.io/projected/818205e7-2675-44eb-a9fb-e7834ac224d3-kube-api-access-6tdrf\") on node \"crc\" DevicePath \"\"" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.903395 4907 generic.go:334] "Generic (PLEG): container finished" podID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerID="3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045" exitCode=0 Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.903458 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8rhdj" event={"ID":"818205e7-2675-44eb-a9fb-e7834ac224d3","Type":"ContainerDied","Data":"3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045"} Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.903533 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8rhdj" event={"ID":"818205e7-2675-44eb-a9fb-e7834ac224d3","Type":"ContainerDied","Data":"b225e1f2c70037b4d2c2414fa38c3ecb22a83f3926c9f5e450938ddc1c3434e2"} Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.903555 4907 scope.go:117] "RemoveContainer" containerID="3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.903622 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8rhdj" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.927069 4907 scope.go:117] "RemoveContainer" containerID="1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.960534 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rhdj"] Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.960596 4907 scope.go:117] "RemoveContainer" containerID="15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1" Oct 09 20:25:09 crc kubenswrapper[4907]: I1009 20:25:09.972755 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8rhdj"] Oct 09 20:25:10 crc kubenswrapper[4907]: I1009 20:25:10.014881 4907 scope.go:117] "RemoveContainer" containerID="3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045" Oct 09 20:25:10 crc kubenswrapper[4907]: E1009 20:25:10.015450 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045\": container with ID starting with 3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045 not found: ID does not exist" containerID="3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045" Oct 09 20:25:10 crc kubenswrapper[4907]: I1009 20:25:10.015498 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045"} err="failed to get container status \"3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045\": rpc error: code = NotFound desc = could not find container \"3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045\": container with ID starting with 3f367699e95dc9af599efa68f221269bc1955183049677fa0ff4ae3a7d895045 not found: ID does not exist" Oct 09 20:25:10 crc kubenswrapper[4907]: I1009 20:25:10.015520 4907 scope.go:117] "RemoveContainer" containerID="1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8" Oct 09 20:25:10 crc kubenswrapper[4907]: E1009 20:25:10.015861 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8\": container with ID starting with 1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8 not found: ID does not exist" containerID="1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8" Oct 09 20:25:10 crc kubenswrapper[4907]: I1009 20:25:10.015909 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8"} err="failed to get container status \"1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8\": rpc error: code = NotFound desc = could not find container \"1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8\": container with ID starting with 1f0db1aa87c23705ae9ac2ef7d10be6b785d06d274b3bf7f018dfb57c3aab7f8 not found: ID does not exist" Oct 09 20:25:10 crc kubenswrapper[4907]: I1009 20:25:10.015944 4907 scope.go:117] "RemoveContainer" containerID="15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1" Oct 09 20:25:10 crc kubenswrapper[4907]: E1009 
20:25:10.016314 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1\": container with ID starting with 15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1 not found: ID does not exist" containerID="15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1" Oct 09 20:25:10 crc kubenswrapper[4907]: I1009 20:25:10.016339 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1"} err="failed to get container status \"15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1\": rpc error: code = NotFound desc = could not find container \"15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1\": container with ID starting with 15391fe14ffda848f60b512c6a362673e4e80afd2c88dc19c21d360719ad06e1 not found: ID does not exist" Oct 09 20:25:11 crc kubenswrapper[4907]: I1009 20:25:11.162819 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" path="/var/lib/kubelet/pods/818205e7-2675-44eb-a9fb-e7834ac224d3/volumes" Oct 09 20:25:25 crc kubenswrapper[4907]: I1009 20:25:25.046002 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-kvg9t"] Oct 09 20:25:25 crc kubenswrapper[4907]: I1009 20:25:25.059558 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-kvg9t"] Oct 09 20:25:25 crc kubenswrapper[4907]: I1009 20:25:25.171241 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb04018-212c-4798-85b3-85bd29b275d4" path="/var/lib/kubelet/pods/3cb04018-212c-4798-85b3-85bd29b275d4/volumes" Oct 09 20:25:30 crc kubenswrapper[4907]: I1009 20:25:30.031804 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-7c8tt"] Oct 
09 20:25:30 crc kubenswrapper[4907]: I1009 20:25:30.041188 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-7c8tt"] Oct 09 20:25:31 crc kubenswrapper[4907]: I1009 20:25:31.172855 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f91323-f2d1-4c69-952a-453677ac2d28" path="/var/lib/kubelet/pods/69f91323-f2d1-4c69-952a-453677ac2d28/volumes" Oct 09 20:25:36 crc kubenswrapper[4907]: I1009 20:25:36.299568 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:25:36 crc kubenswrapper[4907]: I1009 20:25:36.300097 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:25:42 crc kubenswrapper[4907]: I1009 20:25:42.022721 4907 scope.go:117] "RemoveContainer" containerID="ec815aa94e48ee18c7125a8aef8d78e430f8636705aa00a71429bddf8acbe56e" Oct 09 20:25:42 crc kubenswrapper[4907]: I1009 20:25:42.051405 4907 scope.go:117] "RemoveContainer" containerID="d17ddfcaeb8c40752de4c617beae455229e669ef819725d7ea26a4bd1eb235e3" Oct 09 20:25:42 crc kubenswrapper[4907]: I1009 20:25:42.092542 4907 scope.go:117] "RemoveContainer" containerID="b421fc5ac8bf6a8a02506638d2dafbb7cea9187d96752319e5ae444604058805" Oct 09 20:25:42 crc kubenswrapper[4907]: I1009 20:25:42.193180 4907 scope.go:117] "RemoveContainer" containerID="d043bc39c30b5457c23cb75307fe268ac009c04692962fe7a73935859b4fc925" Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.299093 4907 patch_prober.go:28] interesting 
pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.299902 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.299985 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.301231 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96432ef9376f05f960921874ca8f70e980e9749884376496546f629d7367aada"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.301337 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://96432ef9376f05f960921874ca8f70e980e9749884376496546f629d7367aada" gracePeriod=600 Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.552827 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="96432ef9376f05f960921874ca8f70e980e9749884376496546f629d7367aada" exitCode=0 Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.552879 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"96432ef9376f05f960921874ca8f70e980e9749884376496546f629d7367aada"} Oct 09 20:26:06 crc kubenswrapper[4907]: I1009 20:26:06.553215 4907 scope.go:117] "RemoveContainer" containerID="5fb37554146fd4cc88276358c61550267f7f0ddc45e9ed1012f784c15f271f4c" Oct 09 20:26:07 crc kubenswrapper[4907]: I1009 20:26:07.563241 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"} Oct 09 20:26:08 crc kubenswrapper[4907]: I1009 20:26:08.577350 4907 generic.go:334] "Generic (PLEG): container finished" podID="1049514e-2b9c-426e-9534-677c595d39d8" containerID="34c9a5bdb7fbfd8cff047f6e916b82889894cc595f0a38632fb38591f8bb5666" exitCode=0 Oct 09 20:26:08 crc kubenswrapper[4907]: I1009 20:26:08.577447 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1049514e-2b9c-426e-9534-677c595d39d8","Type":"ContainerDied","Data":"34c9a5bdb7fbfd8cff047f6e916b82889894cc595f0a38632fb38591f8bb5666"} Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.206274 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295014 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config-secret\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295108 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ssh-key\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295155 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ca-certs\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295232 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-temporary\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295335 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-config-data\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295403 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295480 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zshrp\" (UniqueName: \"kubernetes.io/projected/1049514e-2b9c-426e-9534-677c595d39d8-kube-api-access-zshrp\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295501 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-workdir\") pod \"1049514e-2b9c-426e-9534-677c595d39d8\" (UID: \"1049514e-2b9c-426e-9534-677c595d39d8\") " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.295953 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.296253 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.297302 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-config-data" (OuterVolumeSpecName: "config-data") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.301810 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.311398 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1049514e-2b9c-426e-9534-677c595d39d8-kube-api-access-zshrp" (OuterVolumeSpecName: "kube-api-access-zshrp") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "kube-api-access-zshrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.332758 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.333361 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.340079 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.369802 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.398289 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zshrp\" (UniqueName: \"kubernetes.io/projected/1049514e-2b9c-426e-9534-677c595d39d8-kube-api-access-zshrp\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.398329 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.398341 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.398351 4907 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1049514e-2b9c-426e-9534-677c595d39d8-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.398380 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.398395 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.398600 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1049514e-2b9c-426e-9534-677c595d39d8-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.422805 4907 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.500424 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.602787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1049514e-2b9c-426e-9534-677c595d39d8","Type":"ContainerDied","Data":"1a67fb7d6414ded3a60c4a0027e5e0a5cc07cf00da1a0f5974bf99dc2fa2c4e4"} Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.602824 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a67fb7d6414ded3a60c4a0027e5e0a5cc07cf00da1a0f5974bf99dc2fa2c4e4" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.602836 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.714768 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "1049514e-2b9c-426e-9534-677c595d39d8" (UID: "1049514e-2b9c-426e-9534-677c595d39d8"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:26:10 crc kubenswrapper[4907]: I1009 20:26:10.807636 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1049514e-2b9c-426e-9534-677c595d39d8-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.774217 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 20:26:15 crc kubenswrapper[4907]: E1009 20:26:15.775361 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="extract-content" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.775377 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="extract-content" Oct 09 20:26:15 crc kubenswrapper[4907]: E1009 20:26:15.775409 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="registry-server" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.775417 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="registry-server" Oct 09 20:26:15 crc kubenswrapper[4907]: E1009 20:26:15.775437 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="extract-utilities" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.775445 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="extract-utilities" Oct 09 20:26:15 crc kubenswrapper[4907]: E1009 20:26:15.775484 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1049514e-2b9c-426e-9534-677c595d39d8" containerName="tempest-tests-tempest-tests-runner" Oct 09 20:26:15 crc 
kubenswrapper[4907]: I1009 20:26:15.775493 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1049514e-2b9c-426e-9534-677c595d39d8" containerName="tempest-tests-tempest-tests-runner" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.775728 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="818205e7-2675-44eb-a9fb-e7834ac224d3" containerName="registry-server" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.775748 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1049514e-2b9c-426e-9534-677c595d39d8" containerName="tempest-tests-tempest-tests-runner" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.776711 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.779227 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-f86fh" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.784014 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.916334 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zss2d\" (UniqueName: \"kubernetes.io/projected/a5f99a22-52f0-4435-b5d3-a5bd2c675b50-kube-api-access-zss2d\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a5f99a22-52f0-4435-b5d3-a5bd2c675b50\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:15 crc kubenswrapper[4907]: I1009 20:26:15.916489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"a5f99a22-52f0-4435-b5d3-a5bd2c675b50\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.018247 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zss2d\" (UniqueName: \"kubernetes.io/projected/a5f99a22-52f0-4435-b5d3-a5bd2c675b50-kube-api-access-zss2d\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a5f99a22-52f0-4435-b5d3-a5bd2c675b50\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.018360 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a5f99a22-52f0-4435-b5d3-a5bd2c675b50\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.018850 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a5f99a22-52f0-4435-b5d3-a5bd2c675b50\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.041654 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zss2d\" (UniqueName: \"kubernetes.io/projected/a5f99a22-52f0-4435-b5d3-a5bd2c675b50-kube-api-access-zss2d\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a5f99a22-52f0-4435-b5d3-a5bd2c675b50\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.071793 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a5f99a22-52f0-4435-b5d3-a5bd2c675b50\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.099424 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.548161 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 20:26:16 crc kubenswrapper[4907]: I1009 20:26:16.673637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a5f99a22-52f0-4435-b5d3-a5bd2c675b50","Type":"ContainerStarted","Data":"c5ab2a69dc06424e14157cb8356956b257af1b01f1151959d087f87a27ddbbf9"} Oct 09 20:26:17 crc kubenswrapper[4907]: I1009 20:26:17.686905 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a5f99a22-52f0-4435-b5d3-a5bd2c675b50","Type":"ContainerStarted","Data":"ceb6aea32be49cb9431cdbca5f98f72547753a41897a4094dbf1f2475c0617cd"} Oct 09 20:26:17 crc kubenswrapper[4907]: I1009 20:26:17.706857 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.88614206 podStartE2EDuration="2.706835012s" podCreationTimestamp="2025-10-09 20:26:15 +0000 UTC" firstStartedPulling="2025-10-09 20:26:16.563700476 +0000 UTC m=+3462.095667965" lastFinishedPulling="2025-10-09 20:26:17.384393428 +0000 UTC m=+3462.916360917" observedRunningTime="2025-10-09 20:26:17.702288558 +0000 UTC m=+3463.234256067" watchObservedRunningTime="2025-10-09 20:26:17.706835012 +0000 UTC 
m=+3463.238802511" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.362972 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbkhs"] Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.367421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.379679 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbkhs"] Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.477054 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-catalog-content\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.477131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t9jt\" (UniqueName: \"kubernetes.io/projected/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-kube-api-access-2t9jt\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.477477 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-utilities\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.579012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-utilities\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.579192 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-catalog-content\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.579216 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t9jt\" (UniqueName: \"kubernetes.io/projected/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-kube-api-access-2t9jt\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.579521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-utilities\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.579994 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-catalog-content\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.600166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t9jt\" (UniqueName: 
\"kubernetes.io/projected/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-kube-api-access-2t9jt\") pod \"certified-operators-bbkhs\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:18 crc kubenswrapper[4907]: I1009 20:26:18.696003 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:19 crc kubenswrapper[4907]: I1009 20:26:19.228764 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbkhs"] Oct 09 20:26:19 crc kubenswrapper[4907]: W1009 20:26:19.235616 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2561ac8_b27d_4a3e_97dd_fb52f98d6d24.slice/crio-1fe537a6fa86b8bb57f1dc6806a310118a34c8931e284b55e99a7b423c47e06a WatchSource:0}: Error finding container 1fe537a6fa86b8bb57f1dc6806a310118a34c8931e284b55e99a7b423c47e06a: Status 404 returned error can't find the container with id 1fe537a6fa86b8bb57f1dc6806a310118a34c8931e284b55e99a7b423c47e06a Oct 09 20:26:19 crc kubenswrapper[4907]: I1009 20:26:19.710958 4907 generic.go:334] "Generic (PLEG): container finished" podID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerID="e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93" exitCode=0 Oct 09 20:26:19 crc kubenswrapper[4907]: I1009 20:26:19.711119 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbkhs" event={"ID":"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24","Type":"ContainerDied","Data":"e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93"} Oct 09 20:26:19 crc kubenswrapper[4907]: I1009 20:26:19.712995 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbkhs" 
event={"ID":"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24","Type":"ContainerStarted","Data":"1fe537a6fa86b8bb57f1dc6806a310118a34c8931e284b55e99a7b423c47e06a"} Oct 09 20:26:20 crc kubenswrapper[4907]: I1009 20:26:20.724209 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbkhs" event={"ID":"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24","Type":"ContainerStarted","Data":"e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb"} Oct 09 20:26:22 crc kubenswrapper[4907]: I1009 20:26:22.749203 4907 generic.go:334] "Generic (PLEG): container finished" podID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerID="e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb" exitCode=0 Oct 09 20:26:22 crc kubenswrapper[4907]: I1009 20:26:22.749365 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbkhs" event={"ID":"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24","Type":"ContainerDied","Data":"e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb"} Oct 09 20:26:23 crc kubenswrapper[4907]: I1009 20:26:23.761181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbkhs" event={"ID":"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24","Type":"ContainerStarted","Data":"b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519"} Oct 09 20:26:23 crc kubenswrapper[4907]: I1009 20:26:23.778989 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbkhs" podStartSLOduration=2.286567483 podStartE2EDuration="5.77897202s" podCreationTimestamp="2025-10-09 20:26:18 +0000 UTC" firstStartedPulling="2025-10-09 20:26:19.712943424 +0000 UTC m=+3465.244910913" lastFinishedPulling="2025-10-09 20:26:23.205347961 +0000 UTC m=+3468.737315450" observedRunningTime="2025-10-09 20:26:23.777108004 +0000 UTC m=+3469.309075503" watchObservedRunningTime="2025-10-09 20:26:23.77897202 +0000 UTC 
m=+3469.310939509" Oct 09 20:26:28 crc kubenswrapper[4907]: I1009 20:26:28.696815 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:28 crc kubenswrapper[4907]: I1009 20:26:28.697408 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:28 crc kubenswrapper[4907]: I1009 20:26:28.755140 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:28 crc kubenswrapper[4907]: I1009 20:26:28.869747 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:28 crc kubenswrapper[4907]: I1009 20:26:28.996410 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbkhs"] Oct 09 20:26:30 crc kubenswrapper[4907]: I1009 20:26:30.833252 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bbkhs" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="registry-server" containerID="cri-o://b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519" gracePeriod=2 Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.550657 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.678345 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-utilities\") pod \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.678535 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t9jt\" (UniqueName: \"kubernetes.io/projected/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-kube-api-access-2t9jt\") pod \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.678708 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-catalog-content\") pod \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\" (UID: \"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24\") " Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.679821 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-utilities" (OuterVolumeSpecName: "utilities") pod "c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" (UID: "c2561ac8-b27d-4a3e-97dd-fb52f98d6d24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.687371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-kube-api-access-2t9jt" (OuterVolumeSpecName: "kube-api-access-2t9jt") pod "c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" (UID: "c2561ac8-b27d-4a3e-97dd-fb52f98d6d24"). InnerVolumeSpecName "kube-api-access-2t9jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.725130 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" (UID: "c2561ac8-b27d-4a3e-97dd-fb52f98d6d24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.780894 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.780943 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.780962 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t9jt\" (UniqueName: \"kubernetes.io/projected/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24-kube-api-access-2t9jt\") on node \"crc\" DevicePath \"\"" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.844638 4907 generic.go:334] "Generic (PLEG): container finished" podID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerID="b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519" exitCode=0 Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.844711 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbkhs" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.844724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbkhs" event={"ID":"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24","Type":"ContainerDied","Data":"b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519"} Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.845890 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbkhs" event={"ID":"c2561ac8-b27d-4a3e-97dd-fb52f98d6d24","Type":"ContainerDied","Data":"1fe537a6fa86b8bb57f1dc6806a310118a34c8931e284b55e99a7b423c47e06a"} Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.845913 4907 scope.go:117] "RemoveContainer" containerID="b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.865999 4907 scope.go:117] "RemoveContainer" containerID="e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.878430 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbkhs"] Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.888191 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bbkhs"] Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.907541 4907 scope.go:117] "RemoveContainer" containerID="e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.936691 4907 scope.go:117] "RemoveContainer" containerID="b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519" Oct 09 20:26:31 crc kubenswrapper[4907]: E1009 20:26:31.943581 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519\": container with ID starting with b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519 not found: ID does not exist" containerID="b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.943651 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519"} err="failed to get container status \"b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519\": rpc error: code = NotFound desc = could not find container \"b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519\": container with ID starting with b66fd9553c1511ba2dcf23a8922dd30df52bb9ef8ac7cab2ebd8d0b50724c519 not found: ID does not exist" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.943685 4907 scope.go:117] "RemoveContainer" containerID="e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb" Oct 09 20:26:31 crc kubenswrapper[4907]: E1009 20:26:31.945570 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb\": container with ID starting with e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb not found: ID does not exist" containerID="e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.945728 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb"} err="failed to get container status \"e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb\": rpc error: code = NotFound desc = could not find container \"e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb\": container with ID 
starting with e86b4250a9724f46b36b14b3dcc63e355a547635ac0d29a4abee172d9da1cdeb not found: ID does not exist" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.945812 4907 scope.go:117] "RemoveContainer" containerID="e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93" Oct 09 20:26:31 crc kubenswrapper[4907]: E1009 20:26:31.946819 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93\": container with ID starting with e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93 not found: ID does not exist" containerID="e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93" Oct 09 20:26:31 crc kubenswrapper[4907]: I1009 20:26:31.946866 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93"} err="failed to get container status \"e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93\": rpc error: code = NotFound desc = could not find container \"e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93\": container with ID starting with e2807e4d1ea984233bb7363d99e6ecd4e3945220327607231453f00a9bc14b93 not found: ID does not exist" Oct 09 20:26:33 crc kubenswrapper[4907]: I1009 20:26:33.188218 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" path="/var/lib/kubelet/pods/c2561ac8-b27d-4a3e-97dd-fb52f98d6d24/volumes" Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.811665 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cqxd5/must-gather-rzcbv"] Oct 09 20:26:35 crc kubenswrapper[4907]: E1009 20:26:35.812066 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="registry-server" Oct 09 20:26:35 crc 
kubenswrapper[4907]: I1009 20:26:35.812078 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="registry-server" Oct 09 20:26:35 crc kubenswrapper[4907]: E1009 20:26:35.812093 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="extract-content" Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.812099 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="extract-content" Oct 09 20:26:35 crc kubenswrapper[4907]: E1009 20:26:35.812111 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="extract-utilities" Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.812117 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="extract-utilities" Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.812391 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2561ac8-b27d-4a3e-97dd-fb52f98d6d24" containerName="registry-server" Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.815857 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.818125 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cqxd5"/"openshift-service-ca.crt"
Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.818176 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cqxd5"/"kube-root-ca.crt"
Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.818128 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cqxd5"/"default-dockercfg-k9tds"
Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.829895 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cqxd5/must-gather-rzcbv"]
Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.912130 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj9f4\" (UniqueName: \"kubernetes.io/projected/54ed3cdb-9565-4544-90bb-13b7d3088dc3-kube-api-access-qj9f4\") pod \"must-gather-rzcbv\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") " pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:35 crc kubenswrapper[4907]: I1009 20:26:35.912184 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54ed3cdb-9565-4544-90bb-13b7d3088dc3-must-gather-output\") pod \"must-gather-rzcbv\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") " pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:36 crc kubenswrapper[4907]: I1009 20:26:36.013745 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj9f4\" (UniqueName: \"kubernetes.io/projected/54ed3cdb-9565-4544-90bb-13b7d3088dc3-kube-api-access-qj9f4\") pod \"must-gather-rzcbv\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") " pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:36 crc kubenswrapper[4907]: I1009 20:26:36.014119 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54ed3cdb-9565-4544-90bb-13b7d3088dc3-must-gather-output\") pod \"must-gather-rzcbv\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") " pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:36 crc kubenswrapper[4907]: I1009 20:26:36.014449 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54ed3cdb-9565-4544-90bb-13b7d3088dc3-must-gather-output\") pod \"must-gather-rzcbv\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") " pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:36 crc kubenswrapper[4907]: I1009 20:26:36.039064 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj9f4\" (UniqueName: \"kubernetes.io/projected/54ed3cdb-9565-4544-90bb-13b7d3088dc3-kube-api-access-qj9f4\") pod \"must-gather-rzcbv\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") " pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:36 crc kubenswrapper[4907]: I1009 20:26:36.139743 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:26:36 crc kubenswrapper[4907]: I1009 20:26:36.631986 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cqxd5/must-gather-rzcbv"]
Oct 09 20:26:36 crc kubenswrapper[4907]: W1009 20:26:36.634662 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ed3cdb_9565_4544_90bb_13b7d3088dc3.slice/crio-7abdbe73228db98085c6108cb64e791d5972e7cc678f41c9d2a6dfd4045e0d86 WatchSource:0}: Error finding container 7abdbe73228db98085c6108cb64e791d5972e7cc678f41c9d2a6dfd4045e0d86: Status 404 returned error can't find the container with id 7abdbe73228db98085c6108cb64e791d5972e7cc678f41c9d2a6dfd4045e0d86
Oct 09 20:26:36 crc kubenswrapper[4907]: I1009 20:26:36.956642 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/must-gather-rzcbv" event={"ID":"54ed3cdb-9565-4544-90bb-13b7d3088dc3","Type":"ContainerStarted","Data":"7abdbe73228db98085c6108cb64e791d5972e7cc678f41c9d2a6dfd4045e0d86"}
Oct 09 20:26:42 crc kubenswrapper[4907]: I1009 20:26:42.006485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/must-gather-rzcbv" event={"ID":"54ed3cdb-9565-4544-90bb-13b7d3088dc3","Type":"ContainerStarted","Data":"81c014e53e1281464ef861e52cf1aec5dd96eea91ad041dba9aa4d5aed27c523"}
Oct 09 20:26:42 crc kubenswrapper[4907]: I1009 20:26:42.007100 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/must-gather-rzcbv" event={"ID":"54ed3cdb-9565-4544-90bb-13b7d3088dc3","Type":"ContainerStarted","Data":"250fa350d6555d9d1e98f788e8c9509dcf7f1cf6c2e5d3761241515fd83e403e"}
Oct 09 20:26:42 crc kubenswrapper[4907]: I1009 20:26:42.026654 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cqxd5/must-gather-rzcbv" podStartSLOduration=2.937976858 podStartE2EDuration="7.026631099s" podCreationTimestamp="2025-10-09 20:26:35 +0000 UTC" firstStartedPulling="2025-10-09 20:26:36.636066022 +0000 UTC m=+3482.168033511" lastFinishedPulling="2025-10-09 20:26:40.724720263 +0000 UTC m=+3486.256687752" observedRunningTime="2025-10-09 20:26:42.024093296 +0000 UTC m=+3487.556060795" watchObservedRunningTime="2025-10-09 20:26:42.026631099 +0000 UTC m=+3487.558598588"
Oct 09 20:26:43 crc kubenswrapper[4907]: E1009 20:26:43.776224 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.104:41566->38.102.83.104:46823: write tcp 38.102.83.104:41566->38.102.83.104:46823: write: broken pipe
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.488595 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-k2vkw"]
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.490765 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.608750 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c28c8db5-323e-42b0-ba87-382e3ef236bd-host\") pod \"crc-debug-k2vkw\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") " pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.608985 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrq5\" (UniqueName: \"kubernetes.io/projected/c28c8db5-323e-42b0-ba87-382e3ef236bd-kube-api-access-vkrq5\") pod \"crc-debug-k2vkw\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") " pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.710901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c28c8db5-323e-42b0-ba87-382e3ef236bd-host\") pod \"crc-debug-k2vkw\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") " pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.711027 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrq5\" (UniqueName: \"kubernetes.io/projected/c28c8db5-323e-42b0-ba87-382e3ef236bd-kube-api-access-vkrq5\") pod \"crc-debug-k2vkw\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") " pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.711636 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c28c8db5-323e-42b0-ba87-382e3ef236bd-host\") pod \"crc-debug-k2vkw\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") " pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.736831 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrq5\" (UniqueName: \"kubernetes.io/projected/c28c8db5-323e-42b0-ba87-382e3ef236bd-kube-api-access-vkrq5\") pod \"crc-debug-k2vkw\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") " pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: I1009 20:26:44.814509 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:26:44 crc kubenswrapper[4907]: W1009 20:26:44.881934 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc28c8db5_323e_42b0_ba87_382e3ef236bd.slice/crio-e464d53b196cac8f8eb58aba638b0aec1bdfc316d950963afe11770730c4dc35 WatchSource:0}: Error finding container e464d53b196cac8f8eb58aba638b0aec1bdfc316d950963afe11770730c4dc35: Status 404 returned error can't find the container with id e464d53b196cac8f8eb58aba638b0aec1bdfc316d950963afe11770730c4dc35
Oct 09 20:26:45 crc kubenswrapper[4907]: I1009 20:26:45.043378 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw" event={"ID":"c28c8db5-323e-42b0-ba87-382e3ef236bd","Type":"ContainerStarted","Data":"e464d53b196cac8f8eb58aba638b0aec1bdfc316d950963afe11770730c4dc35"}
Oct 09 20:26:46 crc kubenswrapper[4907]: E1009 20:26:46.460322 4907 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.104:41636->38.102.83.104:46823: read tcp 38.102.83.104:41636->38.102.83.104:46823: read: connection reset by peer
Oct 09 20:26:58 crc kubenswrapper[4907]: I1009 20:26:58.178422 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw" event={"ID":"c28c8db5-323e-42b0-ba87-382e3ef236bd","Type":"ContainerStarted","Data":"257078e1e4b33fbdb782fb89a256fe94d37bcc73125b7c72814d2eedcdc0e762"}
Oct 09 20:26:58 crc kubenswrapper[4907]: I1009 20:26:58.204545 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw" podStartSLOduration=2.176740063 podStartE2EDuration="14.204521736s" podCreationTimestamp="2025-10-09 20:26:44 +0000 UTC" firstStartedPulling="2025-10-09 20:26:44.884792575 +0000 UTC m=+3490.416760064" lastFinishedPulling="2025-10-09 20:26:56.912574248 +0000 UTC m=+3502.444541737" observedRunningTime="2025-10-09 20:26:58.197657605 +0000 UTC m=+3503.729625104" watchObservedRunningTime="2025-10-09 20:26:58.204521736 +0000 UTC m=+3503.736489235"
Oct 09 20:27:38 crc kubenswrapper[4907]: I1009 20:27:38.586093 4907 generic.go:334] "Generic (PLEG): container finished" podID="c28c8db5-323e-42b0-ba87-382e3ef236bd" containerID="257078e1e4b33fbdb782fb89a256fe94d37bcc73125b7c72814d2eedcdc0e762" exitCode=0
Oct 09 20:27:38 crc kubenswrapper[4907]: I1009 20:27:38.586181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw" event={"ID":"c28c8db5-323e-42b0-ba87-382e3ef236bd","Type":"ContainerDied","Data":"257078e1e4b33fbdb782fb89a256fe94d37bcc73125b7c72814d2eedcdc0e762"}
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.728996 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.764235 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-k2vkw"]
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.771877 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkrq5\" (UniqueName: \"kubernetes.io/projected/c28c8db5-323e-42b0-ba87-382e3ef236bd-kube-api-access-vkrq5\") pod \"c28c8db5-323e-42b0-ba87-382e3ef236bd\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") "
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.771943 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c28c8db5-323e-42b0-ba87-382e3ef236bd-host\") pod \"c28c8db5-323e-42b0-ba87-382e3ef236bd\" (UID: \"c28c8db5-323e-42b0-ba87-382e3ef236bd\") "
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.772104 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c28c8db5-323e-42b0-ba87-382e3ef236bd-host" (OuterVolumeSpecName: "host") pod "c28c8db5-323e-42b0-ba87-382e3ef236bd" (UID: "c28c8db5-323e-42b0-ba87-382e3ef236bd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.772669 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c28c8db5-323e-42b0-ba87-382e3ef236bd-host\") on node \"crc\" DevicePath \"\""
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.775001 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-k2vkw"]
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.793332 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28c8db5-323e-42b0-ba87-382e3ef236bd-kube-api-access-vkrq5" (OuterVolumeSpecName: "kube-api-access-vkrq5") pod "c28c8db5-323e-42b0-ba87-382e3ef236bd" (UID: "c28c8db5-323e-42b0-ba87-382e3ef236bd"). InnerVolumeSpecName "kube-api-access-vkrq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:27:39 crc kubenswrapper[4907]: I1009 20:27:39.874711 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkrq5\" (UniqueName: \"kubernetes.io/projected/c28c8db5-323e-42b0-ba87-382e3ef236bd-kube-api-access-vkrq5\") on node \"crc\" DevicePath \"\""
Oct 09 20:27:40 crc kubenswrapper[4907]: I1009 20:27:40.607894 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e464d53b196cac8f8eb58aba638b0aec1bdfc316d950963afe11770730c4dc35"
Oct 09 20:27:40 crc kubenswrapper[4907]: I1009 20:27:40.608498 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-k2vkw"
Oct 09 20:27:40 crc kubenswrapper[4907]: I1009 20:27:40.975013 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-v7zdw"]
Oct 09 20:27:40 crc kubenswrapper[4907]: E1009 20:27:40.975405 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28c8db5-323e-42b0-ba87-382e3ef236bd" containerName="container-00"
Oct 09 20:27:40 crc kubenswrapper[4907]: I1009 20:27:40.975418 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28c8db5-323e-42b0-ba87-382e3ef236bd" containerName="container-00"
Oct 09 20:27:40 crc kubenswrapper[4907]: I1009 20:27:40.975650 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28c8db5-323e-42b0-ba87-382e3ef236bd" containerName="container-00"
Oct 09 20:27:40 crc kubenswrapper[4907]: I1009 20:27:40.976356 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.101938 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv4wb\" (UniqueName: \"kubernetes.io/projected/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-kube-api-access-zv4wb\") pod \"crc-debug-v7zdw\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") " pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.102358 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-host\") pod \"crc-debug-v7zdw\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") " pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.162539 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28c8db5-323e-42b0-ba87-382e3ef236bd" path="/var/lib/kubelet/pods/c28c8db5-323e-42b0-ba87-382e3ef236bd/volumes"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.204869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-host\") pod \"crc-debug-v7zdw\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") " pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.204976 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv4wb\" (UniqueName: \"kubernetes.io/projected/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-kube-api-access-zv4wb\") pod \"crc-debug-v7zdw\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") " pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.205050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-host\") pod \"crc-debug-v7zdw\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") " pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.226203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv4wb\" (UniqueName: \"kubernetes.io/projected/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-kube-api-access-zv4wb\") pod \"crc-debug-v7zdw\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") " pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.294429 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.617656 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw" event={"ID":"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713","Type":"ContainerStarted","Data":"193b596a7344af104290af45e72f7f48b65caa28f7649fe09b5fdc3b53b1e6d3"}
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.617696 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw" event={"ID":"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713","Type":"ContainerStarted","Data":"93dbc0179a584e95ed1edee80298264ba43d117536e7f854a33a32749b1b25cd"}
Oct 09 20:27:41 crc kubenswrapper[4907]: I1009 20:27:41.642325 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw" podStartSLOduration=1.6423063409999998 podStartE2EDuration="1.642306341s" podCreationTimestamp="2025-10-09 20:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 20:27:41.640869755 +0000 UTC m=+3547.172837254" watchObservedRunningTime="2025-10-09 20:27:41.642306341 +0000 UTC m=+3547.174273830"
Oct 09 20:27:42 crc kubenswrapper[4907]: I1009 20:27:42.627263 4907 generic.go:334] "Generic (PLEG): container finished" podID="ff5fcf7c-5015-4b8d-a0b4-dc39a3257713" containerID="193b596a7344af104290af45e72f7f48b65caa28f7649fe09b5fdc3b53b1e6d3" exitCode=0
Oct 09 20:27:42 crc kubenswrapper[4907]: I1009 20:27:42.627533 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw" event={"ID":"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713","Type":"ContainerDied","Data":"193b596a7344af104290af45e72f7f48b65caa28f7649fe09b5fdc3b53b1e6d3"}
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.757608 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.789232 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-v7zdw"]
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.799389 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-v7zdw"]
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.860110 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv4wb\" (UniqueName: \"kubernetes.io/projected/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-kube-api-access-zv4wb\") pod \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") "
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.860307 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-host\") pod \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\" (UID: \"ff5fcf7c-5015-4b8d-a0b4-dc39a3257713\") "
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.860443 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-host" (OuterVolumeSpecName: "host") pod "ff5fcf7c-5015-4b8d-a0b4-dc39a3257713" (UID: "ff5fcf7c-5015-4b8d-a0b4-dc39a3257713"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.860781 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-host\") on node \"crc\" DevicePath \"\""
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.878441 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-kube-api-access-zv4wb" (OuterVolumeSpecName: "kube-api-access-zv4wb") pod "ff5fcf7c-5015-4b8d-a0b4-dc39a3257713" (UID: "ff5fcf7c-5015-4b8d-a0b4-dc39a3257713"). InnerVolumeSpecName "kube-api-access-zv4wb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:27:43 crc kubenswrapper[4907]: I1009 20:27:43.962425 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv4wb\" (UniqueName: \"kubernetes.io/projected/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713-kube-api-access-zv4wb\") on node \"crc\" DevicePath \"\""
Oct 09 20:27:44 crc kubenswrapper[4907]: I1009 20:27:44.647451 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93dbc0179a584e95ed1edee80298264ba43d117536e7f854a33a32749b1b25cd"
Oct 09 20:27:44 crc kubenswrapper[4907]: I1009 20:27:44.647549 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-v7zdw"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.014615 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-sxgjm"]
Oct 09 20:27:45 crc kubenswrapper[4907]: E1009 20:27:45.015041 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5fcf7c-5015-4b8d-a0b4-dc39a3257713" containerName="container-00"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.015052 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5fcf7c-5015-4b8d-a0b4-dc39a3257713" containerName="container-00"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.015277 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5fcf7c-5015-4b8d-a0b4-dc39a3257713" containerName="container-00"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.016116 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.083099 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-host\") pod \"crc-debug-sxgjm\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") " pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.083443 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4g65\" (UniqueName: \"kubernetes.io/projected/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-kube-api-access-q4g65\") pod \"crc-debug-sxgjm\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") " pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.171074 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5fcf7c-5015-4b8d-a0b4-dc39a3257713" path="/var/lib/kubelet/pods/ff5fcf7c-5015-4b8d-a0b4-dc39a3257713/volumes"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.185435 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-host\") pod \"crc-debug-sxgjm\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") " pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.185578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4g65\" (UniqueName: \"kubernetes.io/projected/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-kube-api-access-q4g65\") pod \"crc-debug-sxgjm\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") " pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.186068 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-host\") pod \"crc-debug-sxgjm\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") " pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.206823 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4g65\" (UniqueName: \"kubernetes.io/projected/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-kube-api-access-q4g65\") pod \"crc-debug-sxgjm\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") " pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.333443 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:45 crc kubenswrapper[4907]: W1009 20:27:45.359195 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bbdf113_2bb8_478e_9ebc_8e9838f7ade0.slice/crio-81a7274f3480eab1567068090f1299e9a01eb56bcb30a7189a56c063200b0a6d WatchSource:0}: Error finding container 81a7274f3480eab1567068090f1299e9a01eb56bcb30a7189a56c063200b0a6d: Status 404 returned error can't find the container with id 81a7274f3480eab1567068090f1299e9a01eb56bcb30a7189a56c063200b0a6d
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.662747 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-sxgjm" event={"ID":"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0","Type":"ContainerStarted","Data":"529832cc3665c49f09b90e5ab052ef0b89f56833d6edd6c5bbc03e59e0c420f5"}
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.662987 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/crc-debug-sxgjm" event={"ID":"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0","Type":"ContainerStarted","Data":"81a7274f3480eab1567068090f1299e9a01eb56bcb30a7189a56c063200b0a6d"}
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.732416 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-sxgjm"]
Oct 09 20:27:45 crc kubenswrapper[4907]: I1009 20:27:45.739975 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cqxd5/crc-debug-sxgjm"]
Oct 09 20:27:46 crc kubenswrapper[4907]: I1009 20:27:46.673877 4907 generic.go:334] "Generic (PLEG): container finished" podID="3bbdf113-2bb8-478e-9ebc-8e9838f7ade0" containerID="529832cc3665c49f09b90e5ab052ef0b89f56833d6edd6c5bbc03e59e0c420f5" exitCode=0
Oct 09 20:27:46 crc kubenswrapper[4907]: I1009 20:27:46.793231 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:46 crc kubenswrapper[4907]: I1009 20:27:46.923960 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-host\") pod \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") "
Oct 09 20:27:46 crc kubenswrapper[4907]: I1009 20:27:46.924067 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-host" (OuterVolumeSpecName: "host") pod "3bbdf113-2bb8-478e-9ebc-8e9838f7ade0" (UID: "3bbdf113-2bb8-478e-9ebc-8e9838f7ade0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 20:27:46 crc kubenswrapper[4907]: I1009 20:27:46.924209 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4g65\" (UniqueName: \"kubernetes.io/projected/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-kube-api-access-q4g65\") pod \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\" (UID: \"3bbdf113-2bb8-478e-9ebc-8e9838f7ade0\") "
Oct 09 20:27:46 crc kubenswrapper[4907]: I1009 20:27:46.924759 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-host\") on node \"crc\" DevicePath \"\""
Oct 09 20:27:46 crc kubenswrapper[4907]: I1009 20:27:46.930687 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-kube-api-access-q4g65" (OuterVolumeSpecName: "kube-api-access-q4g65") pod "3bbdf113-2bb8-478e-9ebc-8e9838f7ade0" (UID: "3bbdf113-2bb8-478e-9ebc-8e9838f7ade0"). InnerVolumeSpecName "kube-api-access-q4g65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:27:47 crc kubenswrapper[4907]: I1009 20:27:47.026993 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4g65\" (UniqueName: \"kubernetes.io/projected/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0-kube-api-access-q4g65\") on node \"crc\" DevicePath \"\""
Oct 09 20:27:47 crc kubenswrapper[4907]: I1009 20:27:47.168911 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bbdf113-2bb8-478e-9ebc-8e9838f7ade0" path="/var/lib/kubelet/pods/3bbdf113-2bb8-478e-9ebc-8e9838f7ade0/volumes"
Oct 09 20:27:47 crc kubenswrapper[4907]: I1009 20:27:47.684974 4907 scope.go:117] "RemoveContainer" containerID="529832cc3665c49f09b90e5ab052ef0b89f56833d6edd6c5bbc03e59e0c420f5"
Oct 09 20:27:47 crc kubenswrapper[4907]: I1009 20:27:47.684995 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/crc-debug-sxgjm"
Oct 09 20:27:54 crc kubenswrapper[4907]: I1009 20:27:54.677880 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/init-config-reloader/0.log"
Oct 09 20:27:54 crc kubenswrapper[4907]: I1009 20:27:54.840851 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/alertmanager/0.log"
Oct 09 20:27:54 crc kubenswrapper[4907]: I1009 20:27:54.857825 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/config-reloader/0.log"
Oct 09 20:27:54 crc kubenswrapper[4907]: I1009 20:27:54.891088 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/init-config-reloader/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.039636 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansible-tests-cloudkitty-s00-cloudkitty_92e389b9-a749-4f7e-9c0a-3c901329ff51/ansible-tests-cloudkitty/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.111652 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc76cff8-xmlsw_f9a93848-dc0b-480e-9ec9-fc16c88e00dc/barbican-api/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.185664 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc76cff8-xmlsw_f9a93848-dc0b-480e-9ec9-fc16c88e00dc/barbican-api-log/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.410434 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7484f7b746-btdlm_287de68c-1c57-4f07-ba04-4d0899b26673/barbican-keystone-listener/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.452787 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7484f7b746-btdlm_287de68c-1c57-4f07-ba04-4d0899b26673/barbican-keystone-listener-log/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.595708 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5db85b5857-58t94_856ed5f9-dc9b-43db-9d28-b1e400d25798/barbican-worker/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.608978 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5db85b5857-58t94_856ed5f9-dc9b-43db-9d28-b1e400d25798/barbican-worker-log/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.734318 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt_0fc7b172-694d-4880-a68f-15ba2460d816/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 09 20:27:55 crc kubenswrapper[4907]: I1009 20:27:55.898719 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/ceilometer-central-agent/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.119127 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/proxy-httpd/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.133437 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/ceilometer-notification-agent/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.217691 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/sg-core/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.346911 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_727d6b58-cf41-4fe0-bf13-5a5a82fe2747/cinder-api/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.377673 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_727d6b58-cf41-4fe0-bf13-5a5a82fe2747/cinder-api-log/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.548037 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc694291-2c4e-4bdf-b00c-4025d2018e96/probe/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.584058 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc694291-2c4e-4bdf-b00c-4025d2018e96/cinder-scheduler/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.758778 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9/cloudkitty-api/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.809388 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9/cloudkitty-api-log/0.log"
Oct 09 20:27:56 crc kubenswrapper[4907]: I1009 20:27:56.840236 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_246ca210-2f65-4612-a7ac-dc4e206dd6f0/loki-compactor/0.log"
Oct 09 20:27:57 crc kubenswrapper[4907]: I1009 20:27:57.030864 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-gl9k6_be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3/loki-distributor/0.log"
Oct 09 20:27:57 crc kubenswrapper[4907]: I1009 20:27:57.081899 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-d5l99_d03409bd-dfae-4397-bd24-55c925ce4d25/gateway/0.log"
Oct 09 20:27:57 crc kubenswrapper[4907]: I1009 20:27:57.267306 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-n4njd_dc87ab00-6151-4b9a-828b-b7fab2987f4e/gateway/0.log"
Oct 09 20:27:57 crc kubenswrapper[4907]: I1009 20:27:57.324776 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_2f1d45fa-edcc-4ab0-a435-26fce79f5607/loki-index-gateway/0.log"
Oct 09 20:27:57 crc kubenswrapper[4907]: I1009 20:27:57.491633 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0/loki-ingester/0.log"
Oct 09 20:27:57 crc kubenswrapper[4907]: I1009 20:27:57.520867 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-68bbd7984c-gtcgb_8591c06c-4ad0-41b1-b62f-ea21f97f50a4/loki-querier/0.log"
Oct 09 20:27:57 crc kubenswrapper[4907]: I1009 20:27:57.713058 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-s66fd_11282a94-310a-44d0-8edd-8a49d8050096/loki-query-frontend/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.018038 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx_4f6c717a-ca37-4879-babe-36221d9580fa/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.222932 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s_f92986eb-ccf3-4d54-a1e6-5f4168a4bab9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.404448 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w65sw_25257dc2-dcd5-4771-b24d-94e98cd6d8a1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.429528 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_4c613c00-9feb-4432-9d03-b980178cbe26/cloudkitty-proc/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.466991 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gbnql_1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e/init/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.677958 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gbnql_1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e/dnsmasq-dns/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.737976 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gbnql_1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e/init/0.log"
Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.772371 4907 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf_5594394b-d72c-4541-ba69-6342110d2b3a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.901382 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ce1c21df-d6fe-46f5-b959-8c720f7b4fcb/glance-httpd/0.log" Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.912743 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ce1c21df-d6fe-46f5-b959-8c720f7b4fcb/glance-log/0.log" Oct 09 20:27:58 crc kubenswrapper[4907]: I1009 20:27:58.988219 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6bc9d4bf-354f-45c7-8116-6119f3f78b0c/glance-httpd/0.log" Oct 09 20:27:59 crc kubenswrapper[4907]: I1009 20:27:59.042115 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6bc9d4bf-354f-45c7-8116-6119f3f78b0c/glance-log/0.log" Oct 09 20:27:59 crc kubenswrapper[4907]: I1009 20:27:59.156872 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-rf68v_77c2e1e4-f07b-4a72-b68d-661856abd621/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:27:59 crc kubenswrapper[4907]: I1009 20:27:59.462543 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pqds4_3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:27:59 crc kubenswrapper[4907]: I1009 20:27:59.632807 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29334001-d7wk2_e44476e3-d4b7-4c73-a478-9860df9f1d22/keystone-cron/0.log" Oct 09 20:27:59 crc kubenswrapper[4907]: I1009 20:27:59.746823 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6_313461ee-e16e-42e8-97ef-5e2d16f23cb5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:27:59 crc kubenswrapper[4907]: I1009 20:27:59.777520 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-799f6b8dfc-hssd9_c720ed9b-76d3-441e-a0b6-81170e63f46f/keystone-api/0.log" Oct 09 20:28:00 crc kubenswrapper[4907]: I1009 20:28:00.082692 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68f5dff589-gkl29_23247623-e419-4c41-a5dd-1ec60cdc8ccd/neutron-httpd/0.log" Oct 09 20:28:00 crc kubenswrapper[4907]: I1009 20:28:00.145771 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68f5dff589-gkl29_23247623-e419-4c41-a5dd-1ec60cdc8ccd/neutron-api/0.log" Oct 09 20:28:00 crc kubenswrapper[4907]: I1009 20:28:00.181930 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp_1830993a-457e-4730-a805-fa14152f2824/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:00 crc kubenswrapper[4907]: I1009 20:28:00.693767 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_83277287-28b0-43e3-98e7-e8367e7a87d9/nova-api-log/0.log" Oct 09 20:28:00 crc kubenswrapper[4907]: I1009 20:28:00.715165 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_92173588-8d80-440e-9dd4-62b132d5abed/nova-cell0-conductor-conductor/0.log" Oct 09 20:28:00 crc kubenswrapper[4907]: I1009 20:28:00.994882 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_83277287-28b0-43e3-98e7-e8367e7a87d9/nova-api-api/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.055588 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_01275002-ecaa-441e-b1a1-035dd770cb1d/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.082730 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3dbc636b-ea8b-4e61-bce8-2d6aaae5d855/nova-cell1-conductor-conductor/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.254570 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mj6zm_cf1c00bc-7815-4bf7-8c42-d85c38936b4b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.357297 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_33bc0e25-b33d-4af0-b735-cac7deff34eb/nova-metadata-log/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.703294 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f087483f-1925-46d1-a58d-c7cf2354fbb1/nova-scheduler-scheduler/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.767690 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92026239-8122-4224-ae55-be69f2c42a77/mysql-bootstrap/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.943453 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92026239-8122-4224-ae55-be69f2c42a77/mysql-bootstrap/0.log" Oct 09 20:28:01 crc kubenswrapper[4907]: I1009 20:28:01.974452 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92026239-8122-4224-ae55-be69f2c42a77/galera/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 20:28:02.148808 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_892437ea-977d-434a-ba03-2ce726fb21b0/mysql-bootstrap/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 
20:28:02.374754 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_892437ea-977d-434a-ba03-2ce726fb21b0/mysql-bootstrap/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 20:28:02.389301 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_892437ea-977d-434a-ba03-2ce726fb21b0/galera/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 20:28:02.573696 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_cd3f8f7d-d0f5-4719-a490-d823cf3c8b23/openstackclient/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 20:28:02.692999 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_33bc0e25-b33d-4af0-b735-cac7deff34eb/nova-metadata-metadata/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 20:28:02.769966 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dz7f2_c9bf7943-cd49-4a26-83e2-9efc4c9dcc02/ovn-controller/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 20:28:02.903188 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fhw7x_ed4e1f67-4cb8-4d46-823b-fb81e37d63c1/openstack-network-exporter/0.log" Oct 09 20:28:02 crc kubenswrapper[4907]: I1009 20:28:02.960947 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovsdb-server-init/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.240104 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovsdb-server/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.243194 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovs-vswitchd/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.251855 4907 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovsdb-server-init/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.490134 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97065ead-95b6-46d3-ab20-a073f6b5f243/openstack-network-exporter/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.509383 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nxtwk_6a6607e9-2440-4d12-8649-28e484f86815/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.573886 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97065ead-95b6-46d3-ab20-a073f6b5f243/ovn-northd/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.783924 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9c18daae-d993-4ac8-954d-f8c38cacedd1/openstack-network-exporter/0.log" Oct 09 20:28:03 crc kubenswrapper[4907]: I1009 20:28:03.793725 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9c18daae-d993-4ac8-954d-f8c38cacedd1/ovsdbserver-nb/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.000593 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f1723ff9-c43f-463d-903c-11f9b38519e2/openstack-network-exporter/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.006862 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f1723ff9-c43f-463d-903c-11f9b38519e2/ovsdbserver-sb/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.135755 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9c9b847d4-2fhz2_ef3a3e5f-5651-4db7-975d-9ee766a36485/placement-api/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 
20:28:04.262430 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/init-config-reloader/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.315835 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9c9b847d4-2fhz2_ef3a3e5f-5651-4db7-975d-9ee766a36485/placement-log/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.484797 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/prometheus/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.485504 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/init-config-reloader/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.516323 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/config-reloader/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.567901 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/thanos-sidecar/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.728398 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4e022a77-723b-47bc-98c5-ad7c72aab0c3/setup-container/0.log" Oct 09 20:28:04 crc kubenswrapper[4907]: I1009 20:28:04.943424 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4e022a77-723b-47bc-98c5-ad7c72aab0c3/rabbitmq/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.007184 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4e022a77-723b-47bc-98c5-ad7c72aab0c3/setup-container/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: 
I1009 20:28:05.013160 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_274be987-64b1-4406-9f04-c81fe651d851/setup-container/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.152574 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_274be987-64b1-4406-9f04-c81fe651d851/setup-container/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.275197 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx_919363ff-2e8b-4837-828e-b5d15a180260/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.286423 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_274be987-64b1-4406-9f04-c81fe651d851/rabbitmq/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.488850 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-8zclv_0de1fb87-8f71-4b57-af90-3568d238da35/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.548192 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf_82707ebf-1ae1-4a8e-b3a3-bff2e91e707a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.693022 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7b6rw_11f08769-69d8-4b65-8684-c132bd006797/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:05 crc kubenswrapper[4907]: I1009 20:28:05.859676 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-g4gz2_68544a9c-d9f1-42c1-8499-83289992b246/ssh-known-hosts-edpm-deployment/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: 
I1009 20:28:06.024892 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78d77d8c99-ljb4q_6b2c2269-5cb3-4bf8-a162-e6a11531eca4/proxy-server/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.121011 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-b2l5z_bba198fe-5d7c-4f5c-a820-ddf9978aed83/swift-ring-rebalance/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.128962 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78d77d8c99-ljb4q_6b2c2269-5cb3-4bf8-a162-e6a11531eca4/proxy-httpd/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.298862 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.298929 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.324419 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-reaper/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.348415 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-auditor/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.369635 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-replicator/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.552835 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-server/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.573254 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-auditor/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.576985 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-server/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.629271 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-replicator/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.797178 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-auditor/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.809580 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-updater/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.845000 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-expirer/0.log" Oct 09 20:28:06 crc kubenswrapper[4907]: I1009 20:28:06.879093 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-replicator/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.007478 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-updater/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.011723 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-server/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.047823 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/rsync/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.156995 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/swift-recon-cron/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.344785 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx_40270666-7351-4172-b5d9-c523b405ae52/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.403309 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1049514e-2b9c-426e-9534-677c595d39d8/tempest-tests-tempest-tests-runner/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.478603 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a5f99a22-52f0-4435-b5d3-a5bd2c675b50/test-operator-logs-container/0.log" Oct 09 20:28:07 crc kubenswrapper[4907]: I1009 20:28:07.650594 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-r42b2_05a8f3e8-9742-4c16-a3a1-2695034bf94d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:28:12 crc kubenswrapper[4907]: I1009 20:28:12.883963 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_51120f59-aafb-4105-b919-fe8e4fc20f93/memcached/0.log" Oct 09 20:28:31 crc kubenswrapper[4907]: I1009 20:28:31.561006 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jzjwr_396b2bde-8328-4285-81a3-58d361096cf8/kube-rbac-proxy/0.log" Oct 09 20:28:31 crc kubenswrapper[4907]: I1009 20:28:31.654103 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jzjwr_396b2bde-8328-4285-81a3-58d361096cf8/manager/0.log" Oct 09 20:28:31 crc kubenswrapper[4907]: I1009 20:28:31.750392 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5pcmk_b1701060-cf14-4dfc-9545-5b63be29728a/kube-rbac-proxy/0.log" Oct 09 20:28:31 crc kubenswrapper[4907]: I1009 20:28:31.871696 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5pcmk_b1701060-cf14-4dfc-9545-5b63be29728a/manager/0.log" Oct 09 20:28:31 crc kubenswrapper[4907]: I1009 20:28:31.966667 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/util/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.114291 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/util/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.128405 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/pull/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.142918 4907 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/pull/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.320339 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/util/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.332312 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/pull/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.372307 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/extract/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.524434 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-5d6rr_1353b956-2119-4690-be09-9f9b788737a5/manager/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.574953 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-5d6rr_1353b956-2119-4690-be09-9f9b788737a5/kube-rbac-proxy/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.575420 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-tpg5p_cdc3d576-f3e6-4016-8856-ff8e5e6cf299/kube-rbac-proxy/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.758747 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-d2zsg_aa1daa5a-4e9e-4378-81ad-0dab2895f34a/kube-rbac-proxy/0.log" Oct 09 20:28:32 crc 
kubenswrapper[4907]: I1009 20:28:32.800912 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-tpg5p_cdc3d576-f3e6-4016-8856-ff8e5e6cf299/manager/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.825034 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-d2zsg_aa1daa5a-4e9e-4378-81ad-0dab2895f34a/manager/0.log" Oct 09 20:28:32 crc kubenswrapper[4907]: I1009 20:28:32.944281 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-595rv_8c17b476-94f3-4391-a755-e816a5ed56e0/kube-rbac-proxy/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.025577 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-595rv_8c17b476-94f3-4391-a755-e816a5ed56e0/manager/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.161194 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-zwj6t_5870b9a9-c7a2-4e57-b917-e5a41c20dc55/kube-rbac-proxy/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.261938 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-khm2s_71138822-c6a1-4657-a640-9350e6e6965c/kube-rbac-proxy/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.348527 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-zwj6t_5870b9a9-c7a2-4e57-b917-e5a41c20dc55/manager/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.420457 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-khm2s_71138822-c6a1-4657-a640-9350e6e6965c/manager/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.479615 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hjhwf_364dc10d-b5b4-4c0e-a480-7dc371fc6a0d/kube-rbac-proxy/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.630899 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-jnbdz_9497c9e0-df89-48ae-be07-7df3e532bb35/kube-rbac-proxy/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.635163 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hjhwf_364dc10d-b5b4-4c0e-a480-7dc371fc6a0d/manager/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.689970 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-jnbdz_9497c9e0-df89-48ae-be07-7df3e532bb35/manager/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.889895 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-q2flj_e5a81d4d-968e-43b0-b53a-e5c475773a29/kube-rbac-proxy/0.log" Oct 09 20:28:33 crc kubenswrapper[4907]: I1009 20:28:33.953414 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-q2flj_e5a81d4d-968e-43b0-b53a-e5c475773a29/manager/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.236391 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-t46vt_759e961c-957a-436b-80cd-14294fce30ad/kube-rbac-proxy/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 
20:28:34.330111 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-t46vt_759e961c-957a-436b-80cd-14294fce30ad/manager/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.390837 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-4bsn7_ddffdb06-43eb-44db-9afa-a56e2c6b467c/kube-rbac-proxy/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.502744 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-4bsn7_ddffdb06-43eb-44db-9afa-a56e2c6b467c/manager/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.611856 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-z5klh_10b627f1-74be-41c8-a7e7-367beb0a828d/kube-rbac-proxy/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.648207 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-z5klh_10b627f1-74be-41c8-a7e7-367beb0a828d/manager/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.791325 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6_523bbf58-dcf0-49f5-a198-24878c574c70/kube-rbac-proxy/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.809648 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6_523bbf58-dcf0-49f5-a198-24878c574c70/manager/0.log" Oct 09 20:28:34 crc kubenswrapper[4907]: I1009 20:28:34.898868 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d899f9cc7-5mhzg_385f8c3a-5a7a-4214-a8cf-9c6886264ea9/kube-rbac-proxy/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.215456 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-88fd6dc46-qtcmh_715d1a2f-5bf3-4ef7-9086-c1f450daa6eb/kube-rbac-proxy/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.347756 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9pktm_abdc2315-d020-4dc6-901d-75db1c33254f/registry-server/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.473789 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-88fd6dc46-qtcmh_715d1a2f-5bf3-4ef7-9086-c1f450daa6eb/operator/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.601510 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-v87t7_64d8141d-db49-44dd-90bc-20b75a642c99/kube-rbac-proxy/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.668843 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-v87t7_64d8141d-db49-44dd-90bc-20b75a642c99/manager/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.757144 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-gbxbh_94e5bb04-8b14-4518-846b-721c24bc2348/kube-rbac-proxy/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.909152 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-gbxbh_94e5bb04-8b14-4518-846b-721c24bc2348/manager/0.log" Oct 09 20:28:35 crc kubenswrapper[4907]: I1009 20:28:35.987832 4907 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5_9ca2b641-57af-45b8-b0aa-3b45b08d13a7/operator/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.121812 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-8ztd5_e51f1489-6999-474e-9ae4-5f8598e608d0/kube-rbac-proxy/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.219656 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-8ztd5_e51f1489-6999-474e-9ae4-5f8598e608d0/manager/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.248137 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d899f9cc7-5mhzg_385f8c3a-5a7a-4214-a8cf-9c6886264ea9/manager/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.259139 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-895c94468-xtfng_7150c799-4a61-4c14-9471-99fbc61a8f7b/kube-rbac-proxy/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.299507 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.299579 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 
20:28:36.483657 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9g4cj_1313e0f0-b372-43cd-8f32-7c6bd566ab1a/manager/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.543493 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9g4cj_1313e0f0-b372-43cd-8f32-7c6bd566ab1a/kube-rbac-proxy/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.543886 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-895c94468-xtfng_7150c799-4a61-4c14-9471-99fbc61a8f7b/manager/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.687846 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-p5c55_5681fed3-74e3-4e40-beff-cebbe06023e4/kube-rbac-proxy/0.log" Oct 09 20:28:36 crc kubenswrapper[4907]: I1009 20:28:36.719897 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-p5c55_5681fed3-74e3-4e40-beff-cebbe06023e4/manager/0.log" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.194931 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kzbw2"] Oct 09 20:28:44 crc kubenswrapper[4907]: E1009 20:28:44.195982 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bbdf113-2bb8-478e-9ebc-8e9838f7ade0" containerName="container-00" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.196001 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbdf113-2bb8-478e-9ebc-8e9838f7ade0" containerName="container-00" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.196284 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bbdf113-2bb8-478e-9ebc-8e9838f7ade0" containerName="container-00" Oct 09 20:28:44 crc 
kubenswrapper[4907]: I1009 20:28:44.198209 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.226048 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kzbw2"] Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.405362 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-utilities\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.405423 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8r7\" (UniqueName: \"kubernetes.io/projected/9e586275-c578-4b9a-9370-5db95eef6219-kube-api-access-5k8r7\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.405714 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-catalog-content\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.507759 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8r7\" (UniqueName: \"kubernetes.io/projected/9e586275-c578-4b9a-9370-5db95eef6219-kube-api-access-5k8r7\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc 
kubenswrapper[4907]: I1009 20:28:44.507894 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-catalog-content\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.508074 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-utilities\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.508528 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-catalog-content\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.508545 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-utilities\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.533204 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8r7\" (UniqueName: \"kubernetes.io/projected/9e586275-c578-4b9a-9370-5db95eef6219-kube-api-access-5k8r7\") pod \"redhat-operators-kzbw2\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:44 crc kubenswrapper[4907]: I1009 20:28:44.830368 4907 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:45 crc kubenswrapper[4907]: I1009 20:28:45.353534 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kzbw2"] Oct 09 20:28:46 crc kubenswrapper[4907]: I1009 20:28:46.314019 4907 generic.go:334] "Generic (PLEG): container finished" podID="9e586275-c578-4b9a-9370-5db95eef6219" containerID="49b426a61a66075a874c1aa3f68b678de30591bde1afdfc1e8a61ab5c37566e7" exitCode=0 Oct 09 20:28:46 crc kubenswrapper[4907]: I1009 20:28:46.314072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzbw2" event={"ID":"9e586275-c578-4b9a-9370-5db95eef6219","Type":"ContainerDied","Data":"49b426a61a66075a874c1aa3f68b678de30591bde1afdfc1e8a61ab5c37566e7"} Oct 09 20:28:46 crc kubenswrapper[4907]: I1009 20:28:46.314267 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzbw2" event={"ID":"9e586275-c578-4b9a-9370-5db95eef6219","Type":"ContainerStarted","Data":"553dcd2b2443174dea186330aa8d9618a021e2b838fec7a291b5326b3fe837bf"} Oct 09 20:28:47 crc kubenswrapper[4907]: I1009 20:28:47.324368 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzbw2" event={"ID":"9e586275-c578-4b9a-9370-5db95eef6219","Type":"ContainerStarted","Data":"34b4816743e002522d9e89015b4ba4f7cb6884aa16e8dd1ced74a6ae749c3d11"} Oct 09 20:28:50 crc kubenswrapper[4907]: I1009 20:28:50.984579 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f7gtg"] Oct 09 20:28:50 crc kubenswrapper[4907]: I1009 20:28:50.987263 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.007906 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7gtg"] Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.144483 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-catalog-content\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.144941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dp6\" (UniqueName: \"kubernetes.io/projected/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-kube-api-access-l5dp6\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.144989 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-utilities\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.247147 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-catalog-content\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.247261 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l5dp6\" (UniqueName: \"kubernetes.io/projected/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-kube-api-access-l5dp6\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.247289 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-utilities\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.247912 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-utilities\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.247966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-catalog-content\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.275365 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5dp6\" (UniqueName: \"kubernetes.io/projected/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-kube-api-access-l5dp6\") pod \"community-operators-f7gtg\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.353128 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.365844 4907 generic.go:334] "Generic (PLEG): container finished" podID="9e586275-c578-4b9a-9370-5db95eef6219" containerID="34b4816743e002522d9e89015b4ba4f7cb6884aa16e8dd1ced74a6ae749c3d11" exitCode=0 Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.365889 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzbw2" event={"ID":"9e586275-c578-4b9a-9370-5db95eef6219","Type":"ContainerDied","Data":"34b4816743e002522d9e89015b4ba4f7cb6884aa16e8dd1ced74a6ae749c3d11"} Oct 09 20:28:51 crc kubenswrapper[4907]: I1009 20:28:51.975848 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7gtg"] Oct 09 20:28:51 crc kubenswrapper[4907]: W1009 20:28:51.979039 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c9c3c1_b96f_4623_b9f4_660c06f28fd0.slice/crio-8fae4fcef5bb881614558d7d1a8d8f2645403d6f84335e43296c8b4d078a9a79 WatchSource:0}: Error finding container 8fae4fcef5bb881614558d7d1a8d8f2645403d6f84335e43296c8b4d078a9a79: Status 404 returned error can't find the container with id 8fae4fcef5bb881614558d7d1a8d8f2645403d6f84335e43296c8b4d078a9a79 Oct 09 20:28:52 crc kubenswrapper[4907]: I1009 20:28:52.377644 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzbw2" event={"ID":"9e586275-c578-4b9a-9370-5db95eef6219","Type":"ContainerStarted","Data":"25c66b66ad57f4a8c8e856f0d788ea4b6cc0987bf80a92e04dbef79e08e3aacd"} Oct 09 20:28:52 crc kubenswrapper[4907]: I1009 20:28:52.381681 4907 generic.go:334] "Generic (PLEG): container finished" podID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerID="444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158" exitCode=0 Oct 09 20:28:52 crc kubenswrapper[4907]: I1009 
20:28:52.381776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7gtg" event={"ID":"69c9c3c1-b96f-4623-b9f4-660c06f28fd0","Type":"ContainerDied","Data":"444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158"} Oct 09 20:28:52 crc kubenswrapper[4907]: I1009 20:28:52.381882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7gtg" event={"ID":"69c9c3c1-b96f-4623-b9f4-660c06f28fd0","Type":"ContainerStarted","Data":"8fae4fcef5bb881614558d7d1a8d8f2645403d6f84335e43296c8b4d078a9a79"} Oct 09 20:28:52 crc kubenswrapper[4907]: I1009 20:28:52.408318 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kzbw2" podStartSLOduration=2.845110703 podStartE2EDuration="8.40799278s" podCreationTimestamp="2025-10-09 20:28:44 +0000 UTC" firstStartedPulling="2025-10-09 20:28:46.315879492 +0000 UTC m=+3611.847846981" lastFinishedPulling="2025-10-09 20:28:51.878761569 +0000 UTC m=+3617.410729058" observedRunningTime="2025-10-09 20:28:52.397630922 +0000 UTC m=+3617.929598411" watchObservedRunningTime="2025-10-09 20:28:52.40799278 +0000 UTC m=+3617.939960269" Oct 09 20:28:53 crc kubenswrapper[4907]: I1009 20:28:53.394336 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7gtg" event={"ID":"69c9c3c1-b96f-4623-b9f4-660c06f28fd0","Type":"ContainerStarted","Data":"02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed"} Oct 09 20:28:54 crc kubenswrapper[4907]: I1009 20:28:54.058807 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-q95gm_02b8e549-abd9-4adb-a77a-f2af6305625a/control-plane-machine-set-operator/0.log" Oct 09 20:28:54 crc kubenswrapper[4907]: I1009 20:28:54.284202 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lqqp7_4150c40f-0b19-4f81-b11c-6b19b25922b1/kube-rbac-proxy/0.log" Oct 09 20:28:54 crc kubenswrapper[4907]: I1009 20:28:54.315648 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lqqp7_4150c40f-0b19-4f81-b11c-6b19b25922b1/machine-api-operator/0.log" Oct 09 20:28:54 crc kubenswrapper[4907]: I1009 20:28:54.832397 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:54 crc kubenswrapper[4907]: I1009 20:28:54.832553 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:28:55 crc kubenswrapper[4907]: I1009 20:28:55.413588 4907 generic.go:334] "Generic (PLEG): container finished" podID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerID="02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed" exitCode=0 Oct 09 20:28:55 crc kubenswrapper[4907]: I1009 20:28:55.413681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7gtg" event={"ID":"69c9c3c1-b96f-4623-b9f4-660c06f28fd0","Type":"ContainerDied","Data":"02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed"} Oct 09 20:28:55 crc kubenswrapper[4907]: I1009 20:28:55.887808 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kzbw2" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="registry-server" probeResult="failure" output=< Oct 09 20:28:55 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 20:28:55 crc kubenswrapper[4907]: > Oct 09 20:28:56 crc kubenswrapper[4907]: I1009 20:28:56.430958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7gtg" 
event={"ID":"69c9c3c1-b96f-4623-b9f4-660c06f28fd0","Type":"ContainerStarted","Data":"41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc"} Oct 09 20:28:56 crc kubenswrapper[4907]: I1009 20:28:56.447041 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f7gtg" podStartSLOduration=2.92191148 podStartE2EDuration="6.447022294s" podCreationTimestamp="2025-10-09 20:28:50 +0000 UTC" firstStartedPulling="2025-10-09 20:28:52.383720635 +0000 UTC m=+3617.915688124" lastFinishedPulling="2025-10-09 20:28:55.908831439 +0000 UTC m=+3621.440798938" observedRunningTime="2025-10-09 20:28:56.445568628 +0000 UTC m=+3621.977536137" watchObservedRunningTime="2025-10-09 20:28:56.447022294 +0000 UTC m=+3621.978989783" Oct 09 20:29:01 crc kubenswrapper[4907]: I1009 20:29:01.353535 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:29:01 crc kubenswrapper[4907]: I1009 20:29:01.353821 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:29:01 crc kubenswrapper[4907]: I1009 20:29:01.400560 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:29:01 crc kubenswrapper[4907]: I1009 20:29:01.532803 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:29:01 crc kubenswrapper[4907]: I1009 20:29:01.635443 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7gtg"] Oct 09 20:29:03 crc kubenswrapper[4907]: I1009 20:29:03.508750 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f7gtg" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="registry-server" 
containerID="cri-o://41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc" gracePeriod=2 Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.198634 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.302707 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-utilities\") pod \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.303026 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-catalog-content\") pod \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.303089 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5dp6\" (UniqueName: \"kubernetes.io/projected/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-kube-api-access-l5dp6\") pod \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\" (UID: \"69c9c3c1-b96f-4623-b9f4-660c06f28fd0\") " Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.304343 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-utilities" (OuterVolumeSpecName: "utilities") pod "69c9c3c1-b96f-4623-b9f4-660c06f28fd0" (UID: "69c9c3c1-b96f-4623-b9f4-660c06f28fd0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.309194 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-kube-api-access-l5dp6" (OuterVolumeSpecName: "kube-api-access-l5dp6") pod "69c9c3c1-b96f-4623-b9f4-660c06f28fd0" (UID: "69c9c3c1-b96f-4623-b9f4-660c06f28fd0"). InnerVolumeSpecName "kube-api-access-l5dp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.348835 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69c9c3c1-b96f-4623-b9f4-660c06f28fd0" (UID: "69c9c3c1-b96f-4623-b9f4-660c06f28fd0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.405808 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.405849 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.405864 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5dp6\" (UniqueName: \"kubernetes.io/projected/69c9c3c1-b96f-4623-b9f4-660c06f28fd0-kube-api-access-l5dp6\") on node \"crc\" DevicePath \"\"" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.520426 4907 generic.go:334] "Generic (PLEG): container finished" podID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" 
containerID="41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc" exitCode=0 Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.520482 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7gtg" event={"ID":"69c9c3c1-b96f-4623-b9f4-660c06f28fd0","Type":"ContainerDied","Data":"41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc"} Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.520510 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7gtg" event={"ID":"69c9c3c1-b96f-4623-b9f4-660c06f28fd0","Type":"ContainerDied","Data":"8fae4fcef5bb881614558d7d1a8d8f2645403d6f84335e43296c8b4d078a9a79"} Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.520528 4907 scope.go:117] "RemoveContainer" containerID="41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.520579 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7gtg" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.550723 4907 scope.go:117] "RemoveContainer" containerID="02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.560227 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7gtg"] Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.569913 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f7gtg"] Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.582925 4907 scope.go:117] "RemoveContainer" containerID="444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.624361 4907 scope.go:117] "RemoveContainer" containerID="41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc" Oct 09 20:29:04 crc kubenswrapper[4907]: E1009 20:29:04.624843 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc\": container with ID starting with 41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc not found: ID does not exist" containerID="41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.624894 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc"} err="failed to get container status \"41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc\": rpc error: code = NotFound desc = could not find container \"41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc\": container with ID starting with 41e26ab7fc0904a2913be45264aa173d275d0aaa140d5911543789954d3809cc not 
found: ID does not exist" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.624923 4907 scope.go:117] "RemoveContainer" containerID="02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed" Oct 09 20:29:04 crc kubenswrapper[4907]: E1009 20:29:04.625218 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed\": container with ID starting with 02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed not found: ID does not exist" containerID="02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.625259 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed"} err="failed to get container status \"02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed\": rpc error: code = NotFound desc = could not find container \"02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed\": container with ID starting with 02e34d0288b99b4c82f0a87d37c19ef61704a82d5c296a001792ad3c8f2bd7ed not found: ID does not exist" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.625289 4907 scope.go:117] "RemoveContainer" containerID="444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158" Oct 09 20:29:04 crc kubenswrapper[4907]: E1009 20:29:04.625646 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158\": container with ID starting with 444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158 not found: ID does not exist" containerID="444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.625705 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158"} err="failed to get container status \"444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158\": rpc error: code = NotFound desc = could not find container \"444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158\": container with ID starting with 444bfc6cca36fd82986811039359c68afb60d7d27e9a865d739e4e4ac0c95158 not found: ID does not exist" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.888341 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:29:04 crc kubenswrapper[4907]: I1009 20:29:04.950926 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:29:05 crc kubenswrapper[4907]: I1009 20:29:05.171335 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" path="/var/lib/kubelet/pods/69c9c3c1-b96f-4623-b9f4-660c06f28fd0/volumes" Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.299720 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.299788 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.299843 4907 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.300434 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.300518 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" gracePeriod=600 Oct 09 20:29:06 crc kubenswrapper[4907]: E1009 20:29:06.431929 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.571733 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" exitCode=0 Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.571776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" 
event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"} Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.571808 4907 scope.go:117] "RemoveContainer" containerID="96432ef9376f05f960921874ca8f70e980e9749884376496546f629d7367aada" Oct 09 20:29:06 crc kubenswrapper[4907]: I1009 20:29:06.572584 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:29:06 crc kubenswrapper[4907]: E1009 20:29:06.572867 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.032443 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kzbw2"] Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.032685 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kzbw2" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="registry-server" containerID="cri-o://25c66b66ad57f4a8c8e856f0d788ea4b6cc0987bf80a92e04dbef79e08e3aacd" gracePeriod=2 Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.076954 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-gp6g8_9000b916-e219-410f-8e0c-29d959f4527b/cert-manager-controller/0.log" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.213571 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-kmxnd_2fed30c7-d2fb-4a70-ae72-6dd33133aa94/cert-manager-webhook/0.log" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.264589 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-jxzvd_023075d5-e7bd-49f9-876a-d728fa5d66ce/cert-manager-cainjector/0.log" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.582895 4907 generic.go:334] "Generic (PLEG): container finished" podID="9e586275-c578-4b9a-9370-5db95eef6219" containerID="25c66b66ad57f4a8c8e856f0d788ea4b6cc0987bf80a92e04dbef79e08e3aacd" exitCode=0 Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.582952 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzbw2" event={"ID":"9e586275-c578-4b9a-9370-5db95eef6219","Type":"ContainerDied","Data":"25c66b66ad57f4a8c8e856f0d788ea4b6cc0987bf80a92e04dbef79e08e3aacd"} Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.736272 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.894833 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-catalog-content\") pod \"9e586275-c578-4b9a-9370-5db95eef6219\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.894915 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-utilities\") pod \"9e586275-c578-4b9a-9370-5db95eef6219\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.895137 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k8r7\" (UniqueName: \"kubernetes.io/projected/9e586275-c578-4b9a-9370-5db95eef6219-kube-api-access-5k8r7\") pod \"9e586275-c578-4b9a-9370-5db95eef6219\" (UID: \"9e586275-c578-4b9a-9370-5db95eef6219\") " Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.895417 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-utilities" (OuterVolumeSpecName: "utilities") pod "9e586275-c578-4b9a-9370-5db95eef6219" (UID: "9e586275-c578-4b9a-9370-5db95eef6219"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.896047 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.912373 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e586275-c578-4b9a-9370-5db95eef6219-kube-api-access-5k8r7" (OuterVolumeSpecName: "kube-api-access-5k8r7") pod "9e586275-c578-4b9a-9370-5db95eef6219" (UID: "9e586275-c578-4b9a-9370-5db95eef6219"). InnerVolumeSpecName "kube-api-access-5k8r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.973699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e586275-c578-4b9a-9370-5db95eef6219" (UID: "9e586275-c578-4b9a-9370-5db95eef6219"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.998255 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e586275-c578-4b9a-9370-5db95eef6219-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:29:07 crc kubenswrapper[4907]: I1009 20:29:07.998287 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k8r7\" (UniqueName: \"kubernetes.io/projected/9e586275-c578-4b9a-9370-5db95eef6219-kube-api-access-5k8r7\") on node \"crc\" DevicePath \"\"" Oct 09 20:29:08 crc kubenswrapper[4907]: I1009 20:29:08.600414 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kzbw2" event={"ID":"9e586275-c578-4b9a-9370-5db95eef6219","Type":"ContainerDied","Data":"553dcd2b2443174dea186330aa8d9618a021e2b838fec7a291b5326b3fe837bf"} Oct 09 20:29:08 crc kubenswrapper[4907]: I1009 20:29:08.600515 4907 scope.go:117] "RemoveContainer" containerID="25c66b66ad57f4a8c8e856f0d788ea4b6cc0987bf80a92e04dbef79e08e3aacd" Oct 09 20:29:08 crc kubenswrapper[4907]: I1009 20:29:08.600669 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kzbw2" Oct 09 20:29:08 crc kubenswrapper[4907]: I1009 20:29:08.627625 4907 scope.go:117] "RemoveContainer" containerID="34b4816743e002522d9e89015b4ba4f7cb6884aa16e8dd1ced74a6ae749c3d11" Oct 09 20:29:08 crc kubenswrapper[4907]: I1009 20:29:08.654370 4907 scope.go:117] "RemoveContainer" containerID="49b426a61a66075a874c1aa3f68b678de30591bde1afdfc1e8a61ab5c37566e7" Oct 09 20:29:08 crc kubenswrapper[4907]: I1009 20:29:08.656540 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kzbw2"] Oct 09 20:29:08 crc kubenswrapper[4907]: I1009 20:29:08.670516 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kzbw2"] Oct 09 20:29:09 crc kubenswrapper[4907]: I1009 20:29:09.170735 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e586275-c578-4b9a-9370-5db95eef6219" path="/var/lib/kubelet/pods/9e586275-c578-4b9a-9370-5db95eef6219/volumes" Oct 09 20:29:17 crc kubenswrapper[4907]: I1009 20:29:17.152311 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:29:17 crc kubenswrapper[4907]: E1009 20:29:17.155165 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:29:18 crc kubenswrapper[4907]: I1009 20:29:18.720905 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-2pltk_d33978d9-506b-49de-ad7c-d4fd0cb80c79/nmstate-console-plugin/0.log" Oct 09 20:29:18 crc kubenswrapper[4907]: I1009 
20:29:18.930637 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-92f76_a9bc1d23-a1e5-4879-ad7d-635639d6cb12/nmstate-handler/0.log" Oct 09 20:29:18 crc kubenswrapper[4907]: I1009 20:29:18.970661 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pfsd4_bda49bd2-44dc-4a59-becb-c3942059ab4d/kube-rbac-proxy/0.log" Oct 09 20:29:19 crc kubenswrapper[4907]: I1009 20:29:19.021127 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pfsd4_bda49bd2-44dc-4a59-becb-c3942059ab4d/nmstate-metrics/0.log" Oct 09 20:29:19 crc kubenswrapper[4907]: I1009 20:29:19.110346 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-2g9nb_9c9331a0-3676-4106-957f-5699d256f0d6/nmstate-operator/0.log" Oct 09 20:29:19 crc kubenswrapper[4907]: I1009 20:29:19.212510 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-n97k6_71ed6ce6-21bd-4132-9ba3-d344520de4a9/nmstate-webhook/0.log" Oct 09 20:29:28 crc kubenswrapper[4907]: I1009 20:29:28.151195 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:29:28 crc kubenswrapper[4907]: E1009 20:29:28.151924 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:29:30 crc kubenswrapper[4907]: I1009 20:29:30.719658 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/kube-rbac-proxy/0.log" Oct 09 20:29:30 crc kubenswrapper[4907]: I1009 20:29:30.766207 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/manager/0.log" Oct 09 20:29:40 crc kubenswrapper[4907]: I1009 20:29:40.152427 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:29:40 crc kubenswrapper[4907]: E1009 20:29:40.153509 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:29:42 crc kubenswrapper[4907]: I1009 20:29:42.924368 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6sl6d_42cf9557-7cae-41c0-bbaa-a3baa099e36c/kube-rbac-proxy/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.155681 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6sl6d_42cf9557-7cae-41c0-bbaa-a3baa099e36c/controller/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.189243 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.365267 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:29:43 crc 
kubenswrapper[4907]: I1009 20:29:43.401055 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.414545 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.440531 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.590678 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.621389 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.622020 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.664111 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.838803 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.860417 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.886487 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/controller/0.log" Oct 09 20:29:43 crc kubenswrapper[4907]: I1009 20:29:43.891475 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.084833 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/frr-metrics/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.090758 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/kube-rbac-proxy/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.110443 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/kube-rbac-proxy-frr/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.332825 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/reloader/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.358478 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-2z2vr_98b09a68-cabf-431c-9885-8f6e36c84de6/frr-k8s-webhook-server/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.572909 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-55785946f8-t68kr_5e6d0933-34d0-4cf2-bc08-75d11b13e618/manager/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.758200 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c6994cd9d-kw99q_7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b/webhook-server/0.log" Oct 09 20:29:44 crc kubenswrapper[4907]: I1009 20:29:44.898520 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m6wfh_db2ff9e3-9b97-4ec9-8b10-c782c7784b8f/kube-rbac-proxy/0.log" Oct 09 20:29:45 crc kubenswrapper[4907]: I1009 20:29:45.418137 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/frr/0.log" Oct 09 20:29:45 crc kubenswrapper[4907]: I1009 20:29:45.533911 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m6wfh_db2ff9e3-9b97-4ec9-8b10-c782c7784b8f/speaker/0.log" Oct 09 20:29:55 crc kubenswrapper[4907]: I1009 20:29:55.164961 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:29:55 crc kubenswrapper[4907]: E1009 20:29:55.165861 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.334221 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/util/0.log" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.473015 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/pull/0.log" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.492366 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/util/0.log" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.537485 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/pull/0.log" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.706210 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/extract/0.log" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.726034 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/pull/0.log" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.741141 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/util/0.log" Oct 09 20:29:57 crc kubenswrapper[4907]: I1009 20:29:57.872065 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/util/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.013477 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/util/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.060621 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/pull/0.log" Oct 09 
20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.070018 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/pull/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.197600 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/extract/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.233626 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/util/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.262474 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/pull/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.381120 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/util/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.556761 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/util/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.567146 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/pull/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.576215 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/pull/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.769459 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/util/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.775899 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/pull/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.782116 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/extract/0.log" Oct 09 20:29:58 crc kubenswrapper[4907]: I1009 20:29:58.947005 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/util/0.log" Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.133025 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/util/0.log" Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.168601 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/pull/0.log" Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.168833 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/pull/0.log" Oct 09 
20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.334710 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/pull/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.345706 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/util/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.351400 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/extract/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.506178 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-utilities/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.658649 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-content/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.687710 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-utilities/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.703192 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-content/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.878798 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-utilities/0.log"
Oct 09 20:29:59 crc kubenswrapper[4907]: I1009 20:29:59.906138 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-content/0.log"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.139958 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-utilities/0.log"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.177887 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"]
Oct 09 20:30:00 crc kubenswrapper[4907]: E1009 20:30:00.178383 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="extract-content"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.178403 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="extract-content"
Oct 09 20:30:00 crc kubenswrapper[4907]: E1009 20:30:00.178423 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="registry-server"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.178430 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="registry-server"
Oct 09 20:30:00 crc kubenswrapper[4907]: E1009 20:30:00.178447 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="extract-content"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.178456 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="extract-content"
Oct 09 20:30:00 crc kubenswrapper[4907]: E1009 20:30:00.179103 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="extract-utilities"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.179123 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="extract-utilities"
Oct 09 20:30:00 crc kubenswrapper[4907]: E1009 20:30:00.179137 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="registry-server"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.179144 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="registry-server"
Oct 09 20:30:00 crc kubenswrapper[4907]: E1009 20:30:00.179163 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="extract-utilities"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.179170 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="extract-utilities"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.179480 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c9c3c1-b96f-4623-b9f4-660c06f28fd0" containerName="registry-server"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.179510 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e586275-c578-4b9a-9370-5db95eef6219" containerName="registry-server"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.180499 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.189661 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.190219 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.198985 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"]
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.295418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42116514-b612-41ac-be40-e00d07564c50-config-volume\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.295602 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42116514-b612-41ac-be40-e00d07564c50-secret-volume\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.295680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnzh\" (UniqueName: \"kubernetes.io/projected/42116514-b612-41ac-be40-e00d07564c50-kube-api-access-nbnzh\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.399846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbnzh\" (UniqueName: \"kubernetes.io/projected/42116514-b612-41ac-be40-e00d07564c50-kube-api-access-nbnzh\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.400000 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42116514-b612-41ac-be40-e00d07564c50-config-volume\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.400145 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42116514-b612-41ac-be40-e00d07564c50-secret-volume\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.401221 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42116514-b612-41ac-be40-e00d07564c50-config-volume\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.424130 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42116514-b612-41ac-be40-e00d07564c50-secret-volume\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.485267 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbnzh\" (UniqueName: \"kubernetes.io/projected/42116514-b612-41ac-be40-e00d07564c50-kube-api-access-nbnzh\") pod \"collect-profiles-29334030-56t95\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.520237 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.681062 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-content/0.log"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.692387 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-utilities/0.log"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.696861 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-content/0.log"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.709993 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/registry-server/0.log"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.855053 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-utilities/0.log"
Oct 09 20:30:00 crc kubenswrapper[4907]: I1009 20:30:00.936710 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-content/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.018458 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/util/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.054896 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"]
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.195104 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/pull/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.195105 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95" event={"ID":"42116514-b612-41ac-be40-e00d07564c50","Type":"ContainerStarted","Data":"31a749145d2ad2ad57c2be8d053baa85a7ee7865de32d0e3fe3f08771d338a87"}
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.250395 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/util/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.314519 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/pull/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.461689 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/util/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.502165 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/pull/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.613204 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/extract/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.624906 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/registry-server/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.648066 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dmrx4_d043494b-98ab-482a-ba53-5f2445d01bea/marketplace-operator/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.748301 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-utilities/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.903023 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-content/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.934609 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-utilities/0.log"
Oct 09 20:30:01 crc kubenswrapper[4907]: I1009 20:30:01.960124 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-content/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.106868 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-utilities/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.110657 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-content/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.171416 4907 generic.go:334] "Generic (PLEG): container finished" podID="42116514-b612-41ac-be40-e00d07564c50" containerID="c2e0e0c084f49a7c07cfd555bb28aba30a4c9bf5ade954880edadfeacb353477" exitCode=0
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.171473 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95" event={"ID":"42116514-b612-41ac-be40-e00d07564c50","Type":"ContainerDied","Data":"c2e0e0c084f49a7c07cfd555bb28aba30a4c9bf5ade954880edadfeacb353477"}
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.186152 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-utilities/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.293951 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/registry-server/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.344432 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-content/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.385658 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-utilities/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.386035 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-content/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.582886 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-utilities/0.log"
Oct 09 20:30:02 crc kubenswrapper[4907]: I1009 20:30:02.592644 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-content/0.log"
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.201592 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/registry-server/0.log"
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.716515 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.869644 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42116514-b612-41ac-be40-e00d07564c50-config-volume\") pod \"42116514-b612-41ac-be40-e00d07564c50\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") "
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.869702 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42116514-b612-41ac-be40-e00d07564c50-secret-volume\") pod \"42116514-b612-41ac-be40-e00d07564c50\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") "
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.869878 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbnzh\" (UniqueName: \"kubernetes.io/projected/42116514-b612-41ac-be40-e00d07564c50-kube-api-access-nbnzh\") pod \"42116514-b612-41ac-be40-e00d07564c50\" (UID: \"42116514-b612-41ac-be40-e00d07564c50\") "
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.870409 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42116514-b612-41ac-be40-e00d07564c50-config-volume" (OuterVolumeSpecName: "config-volume") pod "42116514-b612-41ac-be40-e00d07564c50" (UID: "42116514-b612-41ac-be40-e00d07564c50"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.875261 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42116514-b612-41ac-be40-e00d07564c50-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42116514-b612-41ac-be40-e00d07564c50" (UID: "42116514-b612-41ac-be40-e00d07564c50"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.877768 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42116514-b612-41ac-be40-e00d07564c50-kube-api-access-nbnzh" (OuterVolumeSpecName: "kube-api-access-nbnzh") pod "42116514-b612-41ac-be40-e00d07564c50" (UID: "42116514-b612-41ac-be40-e00d07564c50"). InnerVolumeSpecName "kube-api-access-nbnzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.972840 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42116514-b612-41ac-be40-e00d07564c50-config-volume\") on node \"crc\" DevicePath \"\""
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.972871 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42116514-b612-41ac-be40-e00d07564c50-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 09 20:30:03 crc kubenswrapper[4907]: I1009 20:30:03.972882 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbnzh\" (UniqueName: \"kubernetes.io/projected/42116514-b612-41ac-be40-e00d07564c50-kube-api-access-nbnzh\") on node \"crc\" DevicePath \"\""
Oct 09 20:30:04 crc kubenswrapper[4907]: I1009 20:30:04.196592 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95" event={"ID":"42116514-b612-41ac-be40-e00d07564c50","Type":"ContainerDied","Data":"31a749145d2ad2ad57c2be8d053baa85a7ee7865de32d0e3fe3f08771d338a87"}
Oct 09 20:30:04 crc kubenswrapper[4907]: I1009 20:30:04.196874 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31a749145d2ad2ad57c2be8d053baa85a7ee7865de32d0e3fe3f08771d338a87"
Oct 09 20:30:04 crc kubenswrapper[4907]: I1009 20:30:04.196799 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334030-56t95"
Oct 09 20:30:04 crc kubenswrapper[4907]: I1009 20:30:04.784562 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf"]
Oct 09 20:30:04 crc kubenswrapper[4907]: I1009 20:30:04.793097 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333985-7ktjf"]
Oct 09 20:30:05 crc kubenswrapper[4907]: I1009 20:30:05.197053 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="072c0854-7d0a-4f81-8060-50e85811eeb0" path="/var/lib/kubelet/pods/072c0854-7d0a-4f81-8060-50e85811eeb0/volumes"
Oct 09 20:30:09 crc kubenswrapper[4907]: I1009 20:30:09.152192 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:30:09 crc kubenswrapper[4907]: E1009 20:30:09.153025 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:30:14 crc kubenswrapper[4907]: I1009 20:30:14.658899 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-vfrf8_df906780-9fa6-4336-8b74-dd4061587bfe/prometheus-operator/0.log"
Oct 09 20:30:14 crc kubenswrapper[4907]: I1009 20:30:14.683362 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_d6b76317-83ef-4bab-b4bd-7940ca0c954e/prometheus-operator-admission-webhook/0.log"
Oct 09 20:30:14 crc kubenswrapper[4907]: I1009 20:30:14.811882 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_d1d14ebb-33ce-4f94-b224-f267661a1704/prometheus-operator-admission-webhook/0.log"
Oct 09 20:30:14 crc kubenswrapper[4907]: I1009 20:30:14.916647 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-qp2p6_8f351fc6-080d-41c9-ab41-44dc032b6579/operator/0.log"
Oct 09 20:30:15 crc kubenswrapper[4907]: I1009 20:30:15.010404 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-csbcg_2bb9bd82-399c-43cb-aad5-37832f57ba4f/perses-operator/0.log"
Oct 09 20:30:23 crc kubenswrapper[4907]: I1009 20:30:23.151903 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:30:23 crc kubenswrapper[4907]: E1009 20:30:23.152814 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:30:26 crc kubenswrapper[4907]: I1009 20:30:26.393677 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/manager/0.log"
Oct 09 20:30:26 crc kubenswrapper[4907]: I1009 20:30:26.430453 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/kube-rbac-proxy/0.log"
Oct 09 20:30:35 crc kubenswrapper[4907]: I1009 20:30:35.159121 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:30:35 crc kubenswrapper[4907]: E1009 20:30:35.159799 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:30:42 crc kubenswrapper[4907]: I1009 20:30:42.515970 4907 scope.go:117] "RemoveContainer" containerID="83dea5c6e16336dcab69d4448d09f6308f45fea94c1620c2384a6edeb8ddc627"
Oct 09 20:30:47 crc kubenswrapper[4907]: I1009 20:30:47.151208 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:30:47 crc kubenswrapper[4907]: E1009 20:30:47.152009 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:31:00 crc kubenswrapper[4907]: I1009 20:31:00.152307 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:31:00 crc kubenswrapper[4907]: E1009 20:31:00.153425 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:31:13 crc kubenswrapper[4907]: I1009 20:31:13.156321 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:31:13 crc kubenswrapper[4907]: E1009 20:31:13.157117 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:31:25 crc kubenswrapper[4907]: I1009 20:31:25.159677 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:31:25 crc kubenswrapper[4907]: E1009 20:31:25.161724 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:31:39 crc kubenswrapper[4907]: I1009 20:31:39.161313 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:31:39 crc kubenswrapper[4907]: E1009 20:31:39.162248 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:31:51 crc kubenswrapper[4907]: I1009 20:31:51.151393 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:31:51 crc kubenswrapper[4907]: E1009 20:31:51.152099 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:32:02 crc kubenswrapper[4907]: I1009 20:32:02.151699 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:32:02 crc kubenswrapper[4907]: E1009 20:32:02.152790 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:32:02 crc kubenswrapper[4907]: I1009 20:32:02.404689 4907 generic.go:334] "Generic (PLEG): container finished" podID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerID="250fa350d6555d9d1e98f788e8c9509dcf7f1cf6c2e5d3761241515fd83e403e" exitCode=0
Oct 09 20:32:02 crc kubenswrapper[4907]: I1009 20:32:02.404749 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxd5/must-gather-rzcbv" event={"ID":"54ed3cdb-9565-4544-90bb-13b7d3088dc3","Type":"ContainerDied","Data":"250fa350d6555d9d1e98f788e8c9509dcf7f1cf6c2e5d3761241515fd83e403e"}
Oct 09 20:32:02 crc kubenswrapper[4907]: I1009 20:32:02.405660 4907 scope.go:117] "RemoveContainer" containerID="250fa350d6555d9d1e98f788e8c9509dcf7f1cf6c2e5d3761241515fd83e403e"
Oct 09 20:32:02 crc kubenswrapper[4907]: I1009 20:32:02.478149 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cqxd5_must-gather-rzcbv_54ed3cdb-9565-4544-90bb-13b7d3088dc3/gather/0.log"
Oct 09 20:32:10 crc kubenswrapper[4907]: I1009 20:32:10.892419 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cqxd5/must-gather-rzcbv"]
Oct 09 20:32:10 crc kubenswrapper[4907]: I1009 20:32:10.893111 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cqxd5/must-gather-rzcbv" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerName="copy" containerID="cri-o://81c014e53e1281464ef861e52cf1aec5dd96eea91ad041dba9aa4d5aed27c523" gracePeriod=2
Oct 09 20:32:10 crc kubenswrapper[4907]: I1009 20:32:10.904702 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cqxd5/must-gather-rzcbv"]
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.499542 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cqxd5_must-gather-rzcbv_54ed3cdb-9565-4544-90bb-13b7d3088dc3/copy/0.log"
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.500138 4907 generic.go:334] "Generic (PLEG): container finished" podID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerID="81c014e53e1281464ef861e52cf1aec5dd96eea91ad041dba9aa4d5aed27c523" exitCode=143
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.500196 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7abdbe73228db98085c6108cb64e791d5972e7cc678f41c9d2a6dfd4045e0d86"
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.527137 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cqxd5_must-gather-rzcbv_54ed3cdb-9565-4544-90bb-13b7d3088dc3/copy/0.log"
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.527475 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.684758 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54ed3cdb-9565-4544-90bb-13b7d3088dc3-must-gather-output\") pod \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") "
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.685208 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj9f4\" (UniqueName: \"kubernetes.io/projected/54ed3cdb-9565-4544-90bb-13b7d3088dc3-kube-api-access-qj9f4\") pod \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\" (UID: \"54ed3cdb-9565-4544-90bb-13b7d3088dc3\") "
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.694237 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ed3cdb-9565-4544-90bb-13b7d3088dc3-kube-api-access-qj9f4" (OuterVolumeSpecName: "kube-api-access-qj9f4") pod "54ed3cdb-9565-4544-90bb-13b7d3088dc3" (UID: "54ed3cdb-9565-4544-90bb-13b7d3088dc3"). InnerVolumeSpecName "kube-api-access-qj9f4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.788879 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj9f4\" (UniqueName: \"kubernetes.io/projected/54ed3cdb-9565-4544-90bb-13b7d3088dc3-kube-api-access-qj9f4\") on node \"crc\" DevicePath \"\""
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.820824 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ed3cdb-9565-4544-90bb-13b7d3088dc3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "54ed3cdb-9565-4544-90bb-13b7d3088dc3" (UID: "54ed3cdb-9565-4544-90bb-13b7d3088dc3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 20:32:11 crc kubenswrapper[4907]: I1009 20:32:11.890326 4907 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54ed3cdb-9565-4544-90bb-13b7d3088dc3-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 09 20:32:12 crc kubenswrapper[4907]: I1009 20:32:12.507249 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxd5/must-gather-rzcbv"
Oct 09 20:32:13 crc kubenswrapper[4907]: I1009 20:32:13.162665 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" path="/var/lib/kubelet/pods/54ed3cdb-9565-4544-90bb-13b7d3088dc3/volumes"
Oct 09 20:32:16 crc kubenswrapper[4907]: I1009 20:32:16.151994 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:32:16 crc kubenswrapper[4907]: E1009 20:32:16.152976 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:32:30 crc kubenswrapper[4907]: I1009 20:32:30.151077 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:32:30 crc kubenswrapper[4907]: E1009 20:32:30.151799 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370"
Oct 09 20:32:42 crc kubenswrapper[4907]: I1009 20:32:42.602719 4907 scope.go:117] "RemoveContainer" containerID="81c014e53e1281464ef861e52cf1aec5dd96eea91ad041dba9aa4d5aed27c523"
Oct 09 20:32:42 crc kubenswrapper[4907]: I1009 20:32:42.627678 4907 scope.go:117] "RemoveContainer"
containerID="250fa350d6555d9d1e98f788e8c9509dcf7f1cf6c2e5d3761241515fd83e403e" Oct 09 20:32:44 crc kubenswrapper[4907]: I1009 20:32:44.151374 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:32:44 crc kubenswrapper[4907]: E1009 20:32:44.152174 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.737968 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4nw95/must-gather-mhx4n"] Oct 09 20:32:50 crc kubenswrapper[4907]: E1009 20:32:50.739105 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerName="gather" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.739122 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerName="gather" Oct 09 20:32:50 crc kubenswrapper[4907]: E1009 20:32:50.739141 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerName="copy" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.739148 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerName="copy" Oct 09 20:32:50 crc kubenswrapper[4907]: E1009 20:32:50.739184 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42116514-b612-41ac-be40-e00d07564c50" containerName="collect-profiles" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.739194 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="42116514-b612-41ac-be40-e00d07564c50" containerName="collect-profiles" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.739455 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerName="gather" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.739490 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ed3cdb-9565-4544-90bb-13b7d3088dc3" containerName="copy" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.739512 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="42116514-b612-41ac-be40-e00d07564c50" containerName="collect-profiles" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.741017 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.747604 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4nw95"/"openshift-service-ca.crt" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.747614 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4nw95"/"default-dockercfg-6qlk6" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.747770 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4nw95"/"kube-root-ca.crt" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.775554 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4nw95/must-gather-mhx4n"] Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.914644 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4k5q\" (UniqueName: \"kubernetes.io/projected/ae39e144-b0c3-451f-ac6a-d4638f209ed8-kube-api-access-w4k5q\") pod \"must-gather-mhx4n\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") " 
pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:50 crc kubenswrapper[4907]: I1009 20:32:50.914750 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae39e144-b0c3-451f-ac6a-d4638f209ed8-must-gather-output\") pod \"must-gather-mhx4n\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") " pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:51 crc kubenswrapper[4907]: I1009 20:32:51.016967 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4k5q\" (UniqueName: \"kubernetes.io/projected/ae39e144-b0c3-451f-ac6a-d4638f209ed8-kube-api-access-w4k5q\") pod \"must-gather-mhx4n\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") " pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:51 crc kubenswrapper[4907]: I1009 20:32:51.017075 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae39e144-b0c3-451f-ac6a-d4638f209ed8-must-gather-output\") pod \"must-gather-mhx4n\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") " pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:51 crc kubenswrapper[4907]: I1009 20:32:51.017531 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae39e144-b0c3-451f-ac6a-d4638f209ed8-must-gather-output\") pod \"must-gather-mhx4n\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") " pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:51 crc kubenswrapper[4907]: I1009 20:32:51.043367 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4k5q\" (UniqueName: \"kubernetes.io/projected/ae39e144-b0c3-451f-ac6a-d4638f209ed8-kube-api-access-w4k5q\") pod \"must-gather-mhx4n\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") " 
pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:51 crc kubenswrapper[4907]: I1009 20:32:51.062894 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/must-gather-mhx4n" Oct 09 20:32:51 crc kubenswrapper[4907]: I1009 20:32:51.573323 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4nw95/must-gather-mhx4n"] Oct 09 20:32:51 crc kubenswrapper[4907]: I1009 20:32:51.907919 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/must-gather-mhx4n" event={"ID":"ae39e144-b0c3-451f-ac6a-d4638f209ed8","Type":"ContainerStarted","Data":"ac93c9c715dce26ed1f16fc0fa27a540ac91d6551bbb084d9178bbb6bc41b90b"} Oct 09 20:32:52 crc kubenswrapper[4907]: I1009 20:32:52.919329 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/must-gather-mhx4n" event={"ID":"ae39e144-b0c3-451f-ac6a-d4638f209ed8","Type":"ContainerStarted","Data":"9727f825f6da7b4af12dc1fa904927e38ce64af5f4c8398c1c4deeae92af94c8"} Oct 09 20:32:52 crc kubenswrapper[4907]: I1009 20:32:52.919723 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/must-gather-mhx4n" event={"ID":"ae39e144-b0c3-451f-ac6a-d4638f209ed8","Type":"ContainerStarted","Data":"d47234031804beb9b6e0605be223a96a9cd0e31108794806a0b6156140b5cb8f"} Oct 09 20:32:52 crc kubenswrapper[4907]: I1009 20:32:52.944011 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4nw95/must-gather-mhx4n" podStartSLOduration=2.943989758 podStartE2EDuration="2.943989758s" podCreationTimestamp="2025-10-09 20:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 20:32:52.938180153 +0000 UTC m=+3858.470147662" watchObservedRunningTime="2025-10-09 20:32:52.943989758 +0000 UTC m=+3858.475957247" Oct 09 20:32:55 crc kubenswrapper[4907]: 
I1009 20:32:55.682952 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4nw95/crc-debug-jnhjk"] Oct 09 20:32:55 crc kubenswrapper[4907]: I1009 20:32:55.684704 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:55 crc kubenswrapper[4907]: I1009 20:32:55.817319 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjzt\" (UniqueName: \"kubernetes.io/projected/ad481fd3-36d8-4cfa-885f-3b3719643049-kube-api-access-czjzt\") pod \"crc-debug-jnhjk\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:55 crc kubenswrapper[4907]: I1009 20:32:55.817672 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad481fd3-36d8-4cfa-885f-3b3719643049-host\") pod \"crc-debug-jnhjk\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:55 crc kubenswrapper[4907]: I1009 20:32:55.919509 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czjzt\" (UniqueName: \"kubernetes.io/projected/ad481fd3-36d8-4cfa-885f-3b3719643049-kube-api-access-czjzt\") pod \"crc-debug-jnhjk\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:55 crc kubenswrapper[4907]: I1009 20:32:55.919569 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad481fd3-36d8-4cfa-885f-3b3719643049-host\") pod \"crc-debug-jnhjk\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:55 crc kubenswrapper[4907]: I1009 20:32:55.919863 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/ad481fd3-36d8-4cfa-885f-3b3719643049-host\") pod \"crc-debug-jnhjk\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:55 crc kubenswrapper[4907]: I1009 20:32:55.939694 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjzt\" (UniqueName: \"kubernetes.io/projected/ad481fd3-36d8-4cfa-885f-3b3719643049-kube-api-access-czjzt\") pod \"crc-debug-jnhjk\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:56 crc kubenswrapper[4907]: I1009 20:32:56.004806 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:32:56 crc kubenswrapper[4907]: W1009 20:32:56.106309 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad481fd3_36d8_4cfa_885f_3b3719643049.slice/crio-eb51250008a41a09ea49896d838d3cf4eeb13dc97f53de4e38203957afcd9da6 WatchSource:0}: Error finding container eb51250008a41a09ea49896d838d3cf4eeb13dc97f53de4e38203957afcd9da6: Status 404 returned error can't find the container with id eb51250008a41a09ea49896d838d3cf4eeb13dc97f53de4e38203957afcd9da6 Oct 09 20:32:56 crc kubenswrapper[4907]: I1009 20:32:56.952957 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" event={"ID":"ad481fd3-36d8-4cfa-885f-3b3719643049","Type":"ContainerStarted","Data":"af10ac8aa4a0d51b83923d65b3829c953092d60b9d7f43dc13920ff506a64022"} Oct 09 20:32:56 crc kubenswrapper[4907]: I1009 20:32:56.953578 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" event={"ID":"ad481fd3-36d8-4cfa-885f-3b3719643049","Type":"ContainerStarted","Data":"eb51250008a41a09ea49896d838d3cf4eeb13dc97f53de4e38203957afcd9da6"} Oct 09 20:32:56 crc 
kubenswrapper[4907]: I1009 20:32:56.973343 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" podStartSLOduration=1.973324498 podStartE2EDuration="1.973324498s" podCreationTimestamp="2025-10-09 20:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 20:32:56.968347674 +0000 UTC m=+3862.500315163" watchObservedRunningTime="2025-10-09 20:32:56.973324498 +0000 UTC m=+3862.505291987" Oct 09 20:32:57 crc kubenswrapper[4907]: I1009 20:32:57.151306 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:32:57 crc kubenswrapper[4907]: E1009 20:32:57.151626 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:32:57 crc kubenswrapper[4907]: E1009 20:32:57.487583 4907 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.104:46606->38.102.83.104:46823: read tcp 38.102.83.104:46606->38.102.83.104:46823: read: connection reset by peer Oct 09 20:33:12 crc kubenswrapper[4907]: I1009 20:33:12.152733 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:33:12 crc kubenswrapper[4907]: E1009 20:33:12.154672 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:33:23 crc kubenswrapper[4907]: I1009 20:33:23.152017 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:33:23 crc kubenswrapper[4907]: E1009 20:33:23.152890 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:33:35 crc kubenswrapper[4907]: I1009 20:33:35.153395 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:33:35 crc kubenswrapper[4907]: E1009 20:33:35.154240 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:33:35 crc kubenswrapper[4907]: I1009 20:33:35.377014 4907 generic.go:334] "Generic (PLEG): container finished" podID="ad481fd3-36d8-4cfa-885f-3b3719643049" containerID="af10ac8aa4a0d51b83923d65b3829c953092d60b9d7f43dc13920ff506a64022" exitCode=0 Oct 09 20:33:35 crc kubenswrapper[4907]: I1009 20:33:35.377088 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" 
event={"ID":"ad481fd3-36d8-4cfa-885f-3b3719643049","Type":"ContainerDied","Data":"af10ac8aa4a0d51b83923d65b3829c953092d60b9d7f43dc13920ff506a64022"} Oct 09 20:33:36 crc kubenswrapper[4907]: I1009 20:33:36.900309 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:33:36 crc kubenswrapper[4907]: I1009 20:33:36.965028 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4nw95/crc-debug-jnhjk"] Oct 09 20:33:36 crc kubenswrapper[4907]: I1009 20:33:36.985491 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4nw95/crc-debug-jnhjk"] Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.004860 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad481fd3-36d8-4cfa-885f-3b3719643049-host\") pod \"ad481fd3-36d8-4cfa-885f-3b3719643049\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.005095 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad481fd3-36d8-4cfa-885f-3b3719643049-host" (OuterVolumeSpecName: "host") pod "ad481fd3-36d8-4cfa-885f-3b3719643049" (UID: "ad481fd3-36d8-4cfa-885f-3b3719643049"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.005337 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czjzt\" (UniqueName: \"kubernetes.io/projected/ad481fd3-36d8-4cfa-885f-3b3719643049-kube-api-access-czjzt\") pod \"ad481fd3-36d8-4cfa-885f-3b3719643049\" (UID: \"ad481fd3-36d8-4cfa-885f-3b3719643049\") " Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.005958 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad481fd3-36d8-4cfa-885f-3b3719643049-host\") on node \"crc\" DevicePath \"\"" Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.012651 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad481fd3-36d8-4cfa-885f-3b3719643049-kube-api-access-czjzt" (OuterVolumeSpecName: "kube-api-access-czjzt") pod "ad481fd3-36d8-4cfa-885f-3b3719643049" (UID: "ad481fd3-36d8-4cfa-885f-3b3719643049"). InnerVolumeSpecName "kube-api-access-czjzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.107604 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czjzt\" (UniqueName: \"kubernetes.io/projected/ad481fd3-36d8-4cfa-885f-3b3719643049-kube-api-access-czjzt\") on node \"crc\" DevicePath \"\"" Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.165920 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad481fd3-36d8-4cfa-885f-3b3719643049" path="/var/lib/kubelet/pods/ad481fd3-36d8-4cfa-885f-3b3719643049/volumes" Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.396208 4907 scope.go:117] "RemoveContainer" containerID="af10ac8aa4a0d51b83923d65b3829c953092d60b9d7f43dc13920ff506a64022" Oct 09 20:33:37 crc kubenswrapper[4907]: I1009 20:33:37.396281 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-jnhjk" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.140211 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4nw95/crc-debug-m892b"] Oct 09 20:33:38 crc kubenswrapper[4907]: E1009 20:33:38.140763 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad481fd3-36d8-4cfa-885f-3b3719643049" containerName="container-00" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.140781 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad481fd3-36d8-4cfa-885f-3b3719643049" containerName="container-00" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.141037 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad481fd3-36d8-4cfa-885f-3b3719643049" containerName="container-00" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.142169 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.229587 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/767a01ed-9363-43c2-9604-27a2fc988916-host\") pod \"crc-debug-m892b\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.230825 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxtr7\" (UniqueName: \"kubernetes.io/projected/767a01ed-9363-43c2-9604-27a2fc988916-kube-api-access-zxtr7\") pod \"crc-debug-m892b\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.333239 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/767a01ed-9363-43c2-9604-27a2fc988916-host\") pod \"crc-debug-m892b\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.333354 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxtr7\" (UniqueName: \"kubernetes.io/projected/767a01ed-9363-43c2-9604-27a2fc988916-kube-api-access-zxtr7\") pod \"crc-debug-m892b\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.333391 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/767a01ed-9363-43c2-9604-27a2fc988916-host\") pod \"crc-debug-m892b\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.363233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxtr7\" (UniqueName: \"kubernetes.io/projected/767a01ed-9363-43c2-9604-27a2fc988916-kube-api-access-zxtr7\") pod \"crc-debug-m892b\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:38 crc kubenswrapper[4907]: I1009 20:33:38.458298 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:39 crc kubenswrapper[4907]: I1009 20:33:39.414674 4907 generic.go:334] "Generic (PLEG): container finished" podID="767a01ed-9363-43c2-9604-27a2fc988916" containerID="82b88dd42a1872b78b6ae7298f6d246f72ae650af981740425b0e3d0d3a473e0" exitCode=0 Oct 09 20:33:39 crc kubenswrapper[4907]: I1009 20:33:39.414770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/crc-debug-m892b" event={"ID":"767a01ed-9363-43c2-9604-27a2fc988916","Type":"ContainerDied","Data":"82b88dd42a1872b78b6ae7298f6d246f72ae650af981740425b0e3d0d3a473e0"} Oct 09 20:33:39 crc kubenswrapper[4907]: I1009 20:33:39.415080 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/crc-debug-m892b" event={"ID":"767a01ed-9363-43c2-9604-27a2fc988916","Type":"ContainerStarted","Data":"8f9c00107064a5ceb7477db20766b0a09842422acf45db24ad82ec0907f0eb8f"} Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.481301 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4nw95/crc-debug-m892b"] Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.489364 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4nw95/crc-debug-m892b"] Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.534558 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.578100 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/767a01ed-9363-43c2-9604-27a2fc988916-host\") pod \"767a01ed-9363-43c2-9604-27a2fc988916\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.578156 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/767a01ed-9363-43c2-9604-27a2fc988916-host" (OuterVolumeSpecName: "host") pod "767a01ed-9363-43c2-9604-27a2fc988916" (UID: "767a01ed-9363-43c2-9604-27a2fc988916"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.578368 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxtr7\" (UniqueName: \"kubernetes.io/projected/767a01ed-9363-43c2-9604-27a2fc988916-kube-api-access-zxtr7\") pod \"767a01ed-9363-43c2-9604-27a2fc988916\" (UID: \"767a01ed-9363-43c2-9604-27a2fc988916\") " Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.578861 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/767a01ed-9363-43c2-9604-27a2fc988916-host\") on node \"crc\" DevicePath \"\"" Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.584530 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767a01ed-9363-43c2-9604-27a2fc988916-kube-api-access-zxtr7" (OuterVolumeSpecName: "kube-api-access-zxtr7") pod "767a01ed-9363-43c2-9604-27a2fc988916" (UID: "767a01ed-9363-43c2-9604-27a2fc988916"). InnerVolumeSpecName "kube-api-access-zxtr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:33:40 crc kubenswrapper[4907]: I1009 20:33:40.680433 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxtr7\" (UniqueName: \"kubernetes.io/projected/767a01ed-9363-43c2-9604-27a2fc988916-kube-api-access-zxtr7\") on node \"crc\" DevicePath \"\"" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.163285 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767a01ed-9363-43c2-9604-27a2fc988916" path="/var/lib/kubelet/pods/767a01ed-9363-43c2-9604-27a2fc988916/volumes" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.438131 4907 scope.go:117] "RemoveContainer" containerID="82b88dd42a1872b78b6ae7298f6d246f72ae650af981740425b0e3d0d3a473e0" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.438617 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-m892b" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.783588 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4nw95/crc-debug-tnkmc"] Oct 09 20:33:41 crc kubenswrapper[4907]: E1009 20:33:41.784219 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767a01ed-9363-43c2-9604-27a2fc988916" containerName="container-00" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.784243 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="767a01ed-9363-43c2-9604-27a2fc988916" containerName="container-00" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.784657 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="767a01ed-9363-43c2-9604-27a2fc988916" containerName="container-00" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.785691 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.904828 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkltp\" (UniqueName: \"kubernetes.io/projected/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-kube-api-access-jkltp\") pod \"crc-debug-tnkmc\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:41 crc kubenswrapper[4907]: I1009 20:33:41.905234 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-host\") pod \"crc-debug-tnkmc\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.007319 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkltp\" (UniqueName: \"kubernetes.io/projected/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-kube-api-access-jkltp\") pod \"crc-debug-tnkmc\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.007475 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-host\") pod \"crc-debug-tnkmc\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.007609 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-host\") pod \"crc-debug-tnkmc\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:42 crc 
kubenswrapper[4907]: I1009 20:33:42.032582 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkltp\" (UniqueName: \"kubernetes.io/projected/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-kube-api-access-jkltp\") pod \"crc-debug-tnkmc\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.107986 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.450643 4907 generic.go:334] "Generic (PLEG): container finished" podID="fe39de0a-5da9-4f39-8ce9-17edcd52cb18" containerID="68f18086539432ba629af602580f8bc4bdbe815663cf8214bcb2a781aed63d91" exitCode=0 Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.450714 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/crc-debug-tnkmc" event={"ID":"fe39de0a-5da9-4f39-8ce9-17edcd52cb18","Type":"ContainerDied","Data":"68f18086539432ba629af602580f8bc4bdbe815663cf8214bcb2a781aed63d91"} Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.451032 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/crc-debug-tnkmc" event={"ID":"fe39de0a-5da9-4f39-8ce9-17edcd52cb18","Type":"ContainerStarted","Data":"54c195c86948dbf0481930bca859598d0dfa13f07916e69a4e305034dcfdad95"} Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.488296 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4nw95/crc-debug-tnkmc"] Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.497428 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4nw95/crc-debug-tnkmc"] Oct 09 20:33:42 crc kubenswrapper[4907]: I1009 20:33:42.724740 4907 scope.go:117] "RemoveContainer" containerID="193b596a7344af104290af45e72f7f48b65caa28f7649fe09b5fdc3b53b1e6d3" Oct 09 20:33:42 crc 
kubenswrapper[4907]: I1009 20:33:42.768203 4907 scope.go:117] "RemoveContainer" containerID="257078e1e4b33fbdb782fb89a256fe94d37bcc73125b7c72814d2eedcdc0e762" Oct 09 20:33:43 crc kubenswrapper[4907]: I1009 20:33:43.601482 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:43 crc kubenswrapper[4907]: I1009 20:33:43.646694 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-host\") pod \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " Oct 09 20:33:43 crc kubenswrapper[4907]: I1009 20:33:43.646837 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-host" (OuterVolumeSpecName: "host") pod "fe39de0a-5da9-4f39-8ce9-17edcd52cb18" (UID: "fe39de0a-5da9-4f39-8ce9-17edcd52cb18"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 20:33:43 crc kubenswrapper[4907]: I1009 20:33:43.646887 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkltp\" (UniqueName: \"kubernetes.io/projected/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-kube-api-access-jkltp\") pod \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\" (UID: \"fe39de0a-5da9-4f39-8ce9-17edcd52cb18\") " Oct 09 20:33:43 crc kubenswrapper[4907]: I1009 20:33:43.647517 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-host\") on node \"crc\" DevicePath \"\"" Oct 09 20:33:43 crc kubenswrapper[4907]: I1009 20:33:43.653487 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-kube-api-access-jkltp" (OuterVolumeSpecName: "kube-api-access-jkltp") pod "fe39de0a-5da9-4f39-8ce9-17edcd52cb18" (UID: "fe39de0a-5da9-4f39-8ce9-17edcd52cb18"). InnerVolumeSpecName "kube-api-access-jkltp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:33:43 crc kubenswrapper[4907]: I1009 20:33:43.749775 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkltp\" (UniqueName: \"kubernetes.io/projected/fe39de0a-5da9-4f39-8ce9-17edcd52cb18-kube-api-access-jkltp\") on node \"crc\" DevicePath \"\"" Oct 09 20:33:44 crc kubenswrapper[4907]: I1009 20:33:44.483124 4907 scope.go:117] "RemoveContainer" containerID="68f18086539432ba629af602580f8bc4bdbe815663cf8214bcb2a781aed63d91" Oct 09 20:33:44 crc kubenswrapper[4907]: I1009 20:33:44.483174 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4nw95/crc-debug-tnkmc" Oct 09 20:33:45 crc kubenswrapper[4907]: I1009 20:33:45.165005 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe39de0a-5da9-4f39-8ce9-17edcd52cb18" path="/var/lib/kubelet/pods/fe39de0a-5da9-4f39-8ce9-17edcd52cb18/volumes" Oct 09 20:33:48 crc kubenswrapper[4907]: I1009 20:33:48.151757 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:33:48 crc kubenswrapper[4907]: E1009 20:33:48.153653 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:34:02 crc kubenswrapper[4907]: I1009 20:34:02.151485 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:34:02 crc kubenswrapper[4907]: E1009 20:34:02.152325 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v2wbt_openshift-machine-config-operator(717141fe-c68d-4844-ad99-872d296a6370)\"" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" Oct 09 20:34:09 crc kubenswrapper[4907]: I1009 20:34:09.472737 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/init-config-reloader/0.log" Oct 09 20:34:09 crc kubenswrapper[4907]: I1009 20:34:09.687445 4907 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/init-config-reloader/0.log" Oct 09 20:34:09 crc kubenswrapper[4907]: I1009 20:34:09.723745 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/config-reloader/0.log" Oct 09 20:34:09 crc kubenswrapper[4907]: I1009 20:34:09.741336 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_2c2c1206-5a3b-4d9f-954f-a42d6c6ef0ee/alertmanager/0.log" Oct 09 20:34:09 crc kubenswrapper[4907]: I1009 20:34:09.911987 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc76cff8-xmlsw_f9a93848-dc0b-480e-9ec9-fc16c88e00dc/barbican-api/0.log" Oct 09 20:34:09 crc kubenswrapper[4907]: I1009 20:34:09.916401 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansible-tests-cloudkitty-s00-cloudkitty_92e389b9-a749-4f7e-9c0a-3c901329ff51/ansible-tests-cloudkitty/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.003036 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbc76cff8-xmlsw_f9a93848-dc0b-480e-9ec9-fc16c88e00dc/barbican-api-log/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.102725 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7484f7b746-btdlm_287de68c-1c57-4f07-ba04-4d0899b26673/barbican-keystone-listener/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.237414 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7484f7b746-btdlm_287de68c-1c57-4f07-ba04-4d0899b26673/barbican-keystone-listener-log/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.327591 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5db85b5857-58t94_856ed5f9-dc9b-43db-9d28-b1e400d25798/barbican-worker/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.366350 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5db85b5857-58t94_856ed5f9-dc9b-43db-9d28-b1e400d25798/barbican-worker-log/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.462923 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rbdjt_0fc7b172-694d-4880-a68f-15ba2460d816/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.608630 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/ceilometer-central-agent/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.657657 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/ceilometer-notification-agent/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.746008 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/proxy-httpd/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.799835 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_38c4b2fd-cdaf-4217-a934-3210e6eb4f80/sg-core/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.889606 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_727d6b58-cf41-4fe0-bf13-5a5a82fe2747/cinder-api/0.log" Oct 09 20:34:10 crc kubenswrapper[4907]: I1009 20:34:10.947418 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_727d6b58-cf41-4fe0-bf13-5a5a82fe2747/cinder-api-log/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.142572 4907 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc694291-2c4e-4bdf-b00c-4025d2018e96/probe/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.173849 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_dc694291-2c4e-4bdf-b00c-4025d2018e96/cinder-scheduler/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.303247 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9/cloudkitty-api-log/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.339182 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_9f1e0ea2-e36b-47e4-8fe3-5e8e799e20d9/cloudkitty-api/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.369655 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_246ca210-2f65-4612-a7ac-dc4e206dd6f0/loki-compactor/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.529104 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-56cd74f89f-gl9k6_be0e2e98-8462-4ac1-bcdb-ed76c24fb1d3/loki-distributor/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.615204 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-d5l99_d03409bd-dfae-4397-bd24-55c925ce4d25/gateway/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.764967 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-76cc998948-n4njd_dc87ab00-6151-4b9a-828b-b7fab2987f4e/gateway/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 20:34:11.850109 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_2f1d45fa-edcc-4ab0-a435-26fce79f5607/loki-index-gateway/0.log" Oct 09 20:34:11 crc kubenswrapper[4907]: I1009 
20:34:11.958814 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_bbc1d6e4-0e4e-48bf-b98f-c704a19a16c0/loki-ingester/0.log" Oct 09 20:34:12 crc kubenswrapper[4907]: I1009 20:34:12.031201 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-68bbd7984c-gtcgb_8591c06c-4ad0-41b1-b62f-ea21f97f50a4/loki-querier/0.log" Oct 09 20:34:12 crc kubenswrapper[4907]: I1009 20:34:12.156640 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-779849886d-s66fd_11282a94-310a-44d0-8edd-8a49d8050096/loki-query-frontend/0.log" Oct 09 20:34:12 crc kubenswrapper[4907]: I1009 20:34:12.373850 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p9ngx_4f6c717a-ca37-4879-babe-36221d9580fa/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:12 crc kubenswrapper[4907]: I1009 20:34:12.473221 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kkj5s_f92986eb-ccf3-4d54-a1e6-5f4168a4bab9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:12 crc kubenswrapper[4907]: I1009 20:34:12.664386 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w65sw_25257dc2-dcd5-4771-b24d-94e98cd6d8a1/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:12 crc kubenswrapper[4907]: I1009 20:34:12.740359 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gbnql_1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e/init/0.log" Oct 09 20:34:12 crc kubenswrapper[4907]: I1009 20:34:12.948328 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gbnql_1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e/init/0.log" Oct 09 20:34:13 crc 
kubenswrapper[4907]: I1009 20:34:13.002096 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-zcbrf_5594394b-d72c-4541-ba69-6342110d2b3a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.029860 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-gbnql_1ff92e37-5fad-4bc3-954f-4cf7cc3f6b9e/dnsmasq-dns/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.376186 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ce1c21df-d6fe-46f5-b959-8c720f7b4fcb/glance-log/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.467924 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ce1c21df-d6fe-46f5-b959-8c720f7b4fcb/glance-httpd/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.488799 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_4c613c00-9feb-4432-9d03-b980178cbe26/cloudkitty-proc/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.576389 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6bc9d4bf-354f-45c7-8116-6119f3f78b0c/glance-log/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.584824 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6bc9d4bf-354f-45c7-8116-6119f3f78b0c/glance-httpd/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.690597 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-rf68v_77c2e1e4-f07b-4a72-b68d-661856abd621/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.805274 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pqds4_3ae57e09-d45e-4b45-a40e-a52a7f6cf0fd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:13 crc kubenswrapper[4907]: I1009 20:34:13.966389 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29334001-d7wk2_e44476e3-d4b7-4c73-a478-9860df9f1d22/keystone-cron/0.log" Oct 09 20:34:14 crc kubenswrapper[4907]: I1009 20:34:14.111012 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-799f6b8dfc-hssd9_c720ed9b-76d3-441e-a0b6-81170e63f46f/keystone-api/0.log" Oct 09 20:34:14 crc kubenswrapper[4907]: I1009 20:34:14.134818 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-q8kh6_313461ee-e16e-42e8-97ef-5e2d16f23cb5/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:14 crc kubenswrapper[4907]: I1009 20:34:14.475870 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68f5dff589-gkl29_23247623-e419-4c41-a5dd-1ec60cdc8ccd/neutron-httpd/0.log" Oct 09 20:34:14 crc kubenswrapper[4907]: I1009 20:34:14.543669 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68f5dff589-gkl29_23247623-e419-4c41-a5dd-1ec60cdc8ccd/neutron-api/0.log" Oct 09 20:34:14 crc kubenswrapper[4907]: I1009 20:34:14.550347 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5qddp_1830993a-457e-4730-a805-fa14152f2824/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:15 crc kubenswrapper[4907]: I1009 20:34:15.045947 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_83277287-28b0-43e3-98e7-e8367e7a87d9/nova-api-log/0.log" Oct 09 20:34:15 crc kubenswrapper[4907]: I1009 20:34:15.161094 4907 scope.go:117] "RemoveContainer" 
containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226" Oct 09 20:34:15 crc kubenswrapper[4907]: I1009 20:34:15.195747 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_92173588-8d80-440e-9dd4-62b132d5abed/nova-cell0-conductor-conductor/0.log" Oct 09 20:34:15 crc kubenswrapper[4907]: I1009 20:34:15.498728 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3dbc636b-ea8b-4e61-bce8-2d6aaae5d855/nova-cell1-conductor-conductor/0.log" Oct 09 20:34:15 crc kubenswrapper[4907]: I1009 20:34:15.568022 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_01275002-ecaa-441e-b1a1-035dd770cb1d/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 20:34:15 crc kubenswrapper[4907]: I1009 20:34:15.605260 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_83277287-28b0-43e3-98e7-e8367e7a87d9/nova-api-api/0.log" Oct 09 20:34:16 crc kubenswrapper[4907]: I1009 20:34:16.272201 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mj6zm_cf1c00bc-7815-4bf7-8c42-d85c38936b4b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:16 crc kubenswrapper[4907]: I1009 20:34:16.391868 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_33bc0e25-b33d-4af0-b735-cac7deff34eb/nova-metadata-log/0.log" Oct 09 20:34:16 crc kubenswrapper[4907]: I1009 20:34:16.809268 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92026239-8122-4224-ae55-be69f2c42a77/mysql-bootstrap/0.log" Oct 09 20:34:16 crc kubenswrapper[4907]: I1009 20:34:16.824327 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f087483f-1925-46d1-a58d-c7cf2354fbb1/nova-scheduler-scheduler/0.log" Oct 09 20:34:16 crc kubenswrapper[4907]: I1009 20:34:16.832128 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"9f5c4e292f4751d77becab3ec7da97c8f8af17a12dcbc6e8afb0dc4aff3a325c"} Oct 09 20:34:17 crc kubenswrapper[4907]: I1009 20:34:17.037720 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92026239-8122-4224-ae55-be69f2c42a77/mysql-bootstrap/0.log" Oct 09 20:34:17 crc kubenswrapper[4907]: I1009 20:34:17.082224 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_92026239-8122-4224-ae55-be69f2c42a77/galera/0.log" Oct 09 20:34:17 crc kubenswrapper[4907]: I1009 20:34:17.262136 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_892437ea-977d-434a-ba03-2ce726fb21b0/mysql-bootstrap/0.log" Oct 09 20:34:17 crc kubenswrapper[4907]: I1009 20:34:17.770419 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_892437ea-977d-434a-ba03-2ce726fb21b0/mysql-bootstrap/0.log" Oct 09 20:34:17 crc kubenswrapper[4907]: I1009 20:34:17.814423 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_33bc0e25-b33d-4af0-b735-cac7deff34eb/nova-metadata-metadata/0.log" Oct 09 20:34:17 crc kubenswrapper[4907]: I1009 20:34:17.829728 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_892437ea-977d-434a-ba03-2ce726fb21b0/galera/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.065622 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_cd3f8f7d-d0f5-4719-a490-d823cf3c8b23/openstackclient/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.096593 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dz7f2_c9bf7943-cd49-4a26-83e2-9efc4c9dcc02/ovn-controller/0.log" Oct 
09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.258957 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fhw7x_ed4e1f67-4cb8-4d46-823b-fb81e37d63c1/openstack-network-exporter/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.414094 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovsdb-server-init/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.679116 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovsdb-server-init/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.714609 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovsdb-server/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.718378 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9z259_c0f67e81-b9c9-419e-bc68-dcc44ac15f4d/ovs-vswitchd/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.938995 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97065ead-95b6-46d3-ab20-a073f6b5f243/openstack-network-exporter/0.log" Oct 09 20:34:18 crc kubenswrapper[4907]: I1009 20:34:18.952815 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nxtwk_6a6607e9-2440-4d12-8649-28e484f86815/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.075755 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97065ead-95b6-46d3-ab20-a073f6b5f243/ovn-northd/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.128484 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_9c18daae-d993-4ac8-954d-f8c38cacedd1/openstack-network-exporter/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.246765 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9c18daae-d993-4ac8-954d-f8c38cacedd1/ovsdbserver-nb/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.378824 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f1723ff9-c43f-463d-903c-11f9b38519e2/openstack-network-exporter/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.585213 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f1723ff9-c43f-463d-903c-11f9b38519e2/ovsdbserver-sb/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.649855 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9c9b847d4-2fhz2_ef3a3e5f-5651-4db7-975d-9ee766a36485/placement-api/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.787540 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9c9b847d4-2fhz2_ef3a3e5f-5651-4db7-975d-9ee766a36485/placement-log/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.801687 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/init-config-reloader/0.log" Oct 09 20:34:19 crc kubenswrapper[4907]: I1009 20:34:19.989637 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/config-reloader/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.008473 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/prometheus/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.016957 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/thanos-sidecar/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.076459 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_89bdfac5-05c6-427c-bf5e-786017f9dd26/init-config-reloader/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.261552 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4e022a77-723b-47bc-98c5-ad7c72aab0c3/setup-container/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.431009 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4e022a77-723b-47bc-98c5-ad7c72aab0c3/setup-container/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.436886 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4e022a77-723b-47bc-98c5-ad7c72aab0c3/rabbitmq/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.495865 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_274be987-64b1-4406-9f04-c81fe651d851/setup-container/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.718875 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_274be987-64b1-4406-9f04-c81fe651d851/setup-container/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.771572 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_274be987-64b1-4406-9f04-c81fe651d851/rabbitmq/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.792101 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-wrxwx_919363ff-2e8b-4837-828e-b5d15a180260/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:20 crc kubenswrapper[4907]: I1009 20:34:20.919590 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-8zclv_0de1fb87-8f71-4b57-af90-3568d238da35/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.051162 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kc9hf_82707ebf-1ae1-4a8e-b3a3-bff2e91e707a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.183089 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7b6rw_11f08769-69d8-4b65-8684-c132bd006797/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.311172 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-g4gz2_68544a9c-d9f1-42c1-8499-83289992b246/ssh-known-hosts-edpm-deployment/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.535814 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78d77d8c99-ljb4q_6b2c2269-5cb3-4bf8-a162-e6a11531eca4/proxy-server/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.584421 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78d77d8c99-ljb4q_6b2c2269-5cb3-4bf8-a162-e6a11531eca4/proxy-httpd/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.701508 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-b2l5z_bba198fe-5d7c-4f5c-a820-ddf9978aed83/swift-ring-rebalance/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.746133 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-auditor/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.803152 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-reaper/0.log" Oct 09 20:34:21 crc kubenswrapper[4907]: I1009 20:34:21.975220 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-replicator/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.000814 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/account-server/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.009168 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-auditor/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.077013 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-replicator/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.194389 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-server/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.241233 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-auditor/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.243231 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/container-updater/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.379201 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-expirer/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.402742 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-replicator/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.429240 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-server/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.483926 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/object-updater/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.612702 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/rsync/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.676521 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fd0a266f-be5f-4162-87fb-7389f11c37ab/swift-recon-cron/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.760678 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9kbwx_40270666-7351-4172-b5d9-c523b405ae52/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:22 crc kubenswrapper[4907]: I1009 20:34:22.911185 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1049514e-2b9c-426e-9534-677c595d39d8/tempest-tests-tempest-tests-runner/0.log" Oct 09 20:34:23 crc kubenswrapper[4907]: I1009 20:34:23.021498 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a5f99a22-52f0-4435-b5d3-a5bd2c675b50/test-operator-logs-container/0.log" Oct 09 20:34:23 crc kubenswrapper[4907]: I1009 20:34:23.129084 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-r42b2_05a8f3e8-9742-4c16-a3a1-2695034bf94d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 20:34:28 crc kubenswrapper[4907]: I1009 20:34:28.661800 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_51120f59-aafb-4105-b919-fe8e4fc20f93/memcached/0.log" Oct 09 20:34:46 crc kubenswrapper[4907]: I1009 20:34:46.873239 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jzjwr_396b2bde-8328-4285-81a3-58d361096cf8/kube-rbac-proxy/0.log" Oct 09 20:34:46 crc kubenswrapper[4907]: I1009 20:34:46.968172 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-jzjwr_396b2bde-8328-4285-81a3-58d361096cf8/manager/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.099949 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5pcmk_b1701060-cf14-4dfc-9545-5b63be29728a/kube-rbac-proxy/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.186713 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-5pcmk_b1701060-cf14-4dfc-9545-5b63be29728a/manager/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.268522 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/util/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.441082 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/util/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.473628 
4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/pull/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.505101 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/pull/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.656376 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/util/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.663154 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/extract/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.668784 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d3b466a2d892e0bc519a56b57fee79b52de74e2d4f2dc4b92d7806c3fa8ssh9_df6f4e7a-b9be-4c16-a7b3-2f3bffa4d4c3/pull/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.855008 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-tpg5p_cdc3d576-f3e6-4016-8856-ff8e5e6cf299/kube-rbac-proxy/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.861642 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-5d6rr_1353b956-2119-4690-be09-9f9b788737a5/manager/0.log" Oct 09 20:34:47 crc kubenswrapper[4907]: I1009 20:34:47.890522 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-5d6rr_1353b956-2119-4690-be09-9f9b788737a5/kube-rbac-proxy/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.057841 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-d2zsg_aa1daa5a-4e9e-4378-81ad-0dab2895f34a/manager/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.097128 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-d2zsg_aa1daa5a-4e9e-4378-81ad-0dab2895f34a/kube-rbac-proxy/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.097854 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-tpg5p_cdc3d576-f3e6-4016-8856-ff8e5e6cf299/manager/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.284337 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-595rv_8c17b476-94f3-4391-a755-e816a5ed56e0/kube-rbac-proxy/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.319374 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-595rv_8c17b476-94f3-4391-a755-e816a5ed56e0/manager/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.511038 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-zwj6t_5870b9a9-c7a2-4e57-b917-e5a41c20dc55/kube-rbac-proxy/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.598336 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-khm2s_71138822-c6a1-4657-a640-9350e6e6965c/kube-rbac-proxy/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 
20:34:48.611349 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-zwj6t_5870b9a9-c7a2-4e57-b917-e5a41c20dc55/manager/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.713345 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-khm2s_71138822-c6a1-4657-a640-9350e6e6965c/manager/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.781591 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hjhwf_364dc10d-b5b4-4c0e-a480-7dc371fc6a0d/kube-rbac-proxy/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.837530 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hjhwf_364dc10d-b5b4-4c0e-a480-7dc371fc6a0d/manager/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.962062 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-jnbdz_9497c9e0-df89-48ae-be07-7df3e532bb35/manager/0.log" Oct 09 20:34:48 crc kubenswrapper[4907]: I1009 20:34:48.965217 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-jnbdz_9497c9e0-df89-48ae-be07-7df3e532bb35/kube-rbac-proxy/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.134312 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-q2flj_e5a81d4d-968e-43b0-b53a-e5c475773a29/kube-rbac-proxy/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.146725 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-q2flj_e5a81d4d-968e-43b0-b53a-e5c475773a29/manager/0.log" Oct 09 
20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.240758 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-t46vt_759e961c-957a-436b-80cd-14294fce30ad/kube-rbac-proxy/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.394152 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-4bsn7_ddffdb06-43eb-44db-9afa-a56e2c6b467c/kube-rbac-proxy/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.402861 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-t46vt_759e961c-957a-436b-80cd-14294fce30ad/manager/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.499952 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-4bsn7_ddffdb06-43eb-44db-9afa-a56e2c6b467c/manager/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.610096 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-z5klh_10b627f1-74be-41c8-a7e7-367beb0a828d/kube-rbac-proxy/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.622453 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-z5klh_10b627f1-74be-41c8-a7e7-367beb0a828d/manager/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.760149 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6_523bbf58-dcf0-49f5-a198-24878c574c70/kube-rbac-proxy/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.824263 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dhsqt6_523bbf58-dcf0-49f5-a198-24878c574c70/manager/0.log" Oct 09 20:34:49 crc kubenswrapper[4907]: I1009 20:34:49.966106 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d899f9cc7-5mhzg_385f8c3a-5a7a-4214-a8cf-9c6886264ea9/kube-rbac-proxy/0.log" Oct 09 20:34:50 crc kubenswrapper[4907]: I1009 20:34:50.077070 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-88fd6dc46-qtcmh_715d1a2f-5bf3-4ef7-9086-c1f450daa6eb/kube-rbac-proxy/0.log" Oct 09 20:34:50 crc kubenswrapper[4907]: I1009 20:34:50.350486 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9pktm_abdc2315-d020-4dc6-901d-75db1c33254f/registry-server/0.log" Oct 09 20:34:50 crc kubenswrapper[4907]: I1009 20:34:50.418287 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-88fd6dc46-qtcmh_715d1a2f-5bf3-4ef7-9086-c1f450daa6eb/operator/0.log" Oct 09 20:34:50 crc kubenswrapper[4907]: I1009 20:34:50.582430 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-v87t7_64d8141d-db49-44dd-90bc-20b75a642c99/kube-rbac-proxy/0.log" Oct 09 20:34:50 crc kubenswrapper[4907]: I1009 20:34:50.764221 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-gbxbh_94e5bb04-8b14-4518-846b-721c24bc2348/kube-rbac-proxy/0.log" Oct 09 20:34:50 crc kubenswrapper[4907]: I1009 20:34:50.833832 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-v87t7_64d8141d-db49-44dd-90bc-20b75a642c99/manager/0.log" Oct 09 20:34:50 crc kubenswrapper[4907]: I1009 
20:34:50.907307 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-gbxbh_94e5bb04-8b14-4518-846b-721c24bc2348/manager/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.049037 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-ztbz5_9ca2b641-57af-45b8-b0aa-3b45b08d13a7/operator/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.163239 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d899f9cc7-5mhzg_385f8c3a-5a7a-4214-a8cf-9c6886264ea9/manager/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.170368 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-8ztd5_e51f1489-6999-474e-9ae4-5f8598e608d0/kube-rbac-proxy/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.246665 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-8ztd5_e51f1489-6999-474e-9ae4-5f8598e608d0/manager/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.248201 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-895c94468-xtfng_7150c799-4a61-4c14-9471-99fbc61a8f7b/kube-rbac-proxy/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.430207 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9g4cj_1313e0f0-b372-43cd-8f32-7c6bd566ab1a/kube-rbac-proxy/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.482208 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-9g4cj_1313e0f0-b372-43cd-8f32-7c6bd566ab1a/manager/0.log" Oct 09 
20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.558255 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-895c94468-xtfng_7150c799-4a61-4c14-9471-99fbc61a8f7b/manager/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.582363 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-p5c55_5681fed3-74e3-4e40-beff-cebbe06023e4/kube-rbac-proxy/0.log" Oct 09 20:34:51 crc kubenswrapper[4907]: I1009 20:34:51.646506 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-p5c55_5681fed3-74e3-4e40-beff-cebbe06023e4/manager/0.log" Oct 09 20:35:07 crc kubenswrapper[4907]: I1009 20:35:07.996299 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-q95gm_02b8e549-abd9-4adb-a77a-f2af6305625a/control-plane-machine-set-operator/0.log" Oct 09 20:35:08 crc kubenswrapper[4907]: I1009 20:35:08.182340 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lqqp7_4150c40f-0b19-4f81-b11c-6b19b25922b1/kube-rbac-proxy/0.log" Oct 09 20:35:08 crc kubenswrapper[4907]: I1009 20:35:08.235759 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lqqp7_4150c40f-0b19-4f81-b11c-6b19b25922b1/machine-api-operator/0.log" Oct 09 20:35:19 crc kubenswrapper[4907]: I1009 20:35:19.626826 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-gp6g8_9000b916-e219-410f-8e0c-29d959f4527b/cert-manager-controller/0.log" Oct 09 20:35:19 crc kubenswrapper[4907]: I1009 20:35:19.724673 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-jxzvd_023075d5-e7bd-49f9-876a-d728fa5d66ce/cert-manager-cainjector/0.log" Oct 09 20:35:19 crc kubenswrapper[4907]: I1009 20:35:19.793727 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-kmxnd_2fed30c7-d2fb-4a70-ae72-6dd33133aa94/cert-manager-webhook/0.log" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.525397 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pzgr7"] Oct 09 20:35:27 crc kubenswrapper[4907]: E1009 20:35:27.526587 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe39de0a-5da9-4f39-8ce9-17edcd52cb18" containerName="container-00" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.526608 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe39de0a-5da9-4f39-8ce9-17edcd52cb18" containerName="container-00" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.526886 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe39de0a-5da9-4f39-8ce9-17edcd52cb18" containerName="container-00" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.528800 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.538569 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzgr7"] Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.590171 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62nhj\" (UniqueName: \"kubernetes.io/projected/7845796e-4db5-453a-90e3-3f013a6f6530-kube-api-access-62nhj\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.590305 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-utilities\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.590510 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-catalog-content\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.692284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-catalog-content\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.692386 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-62nhj\" (UniqueName: \"kubernetes.io/projected/7845796e-4db5-453a-90e3-3f013a6f6530-kube-api-access-62nhj\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.692484 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-utilities\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.692936 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-utilities\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.693093 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-catalog-content\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.720520 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62nhj\" (UniqueName: \"kubernetes.io/projected/7845796e-4db5-453a-90e3-3f013a6f6530-kube-api-access-62nhj\") pod \"redhat-marketplace-pzgr7\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:27 crc kubenswrapper[4907]: I1009 20:35:27.864926 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:28 crc kubenswrapper[4907]: I1009 20:35:28.380102 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzgr7"] Oct 09 20:35:28 crc kubenswrapper[4907]: W1009 20:35:28.387307 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7845796e_4db5_453a_90e3_3f013a6f6530.slice/crio-c40619008520b34813c8982fb67a21a2c7fe356fdc624a5455948f9bab5c7412 WatchSource:0}: Error finding container c40619008520b34813c8982fb67a21a2c7fe356fdc624a5455948f9bab5c7412: Status 404 returned error can't find the container with id c40619008520b34813c8982fb67a21a2c7fe356fdc624a5455948f9bab5c7412 Oct 09 20:35:28 crc kubenswrapper[4907]: I1009 20:35:28.575361 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzgr7" event={"ID":"7845796e-4db5-453a-90e3-3f013a6f6530","Type":"ContainerStarted","Data":"c40619008520b34813c8982fb67a21a2c7fe356fdc624a5455948f9bab5c7412"} Oct 09 20:35:29 crc kubenswrapper[4907]: I1009 20:35:29.588079 4907 generic.go:334] "Generic (PLEG): container finished" podID="7845796e-4db5-453a-90e3-3f013a6f6530" containerID="13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087" exitCode=0 Oct 09 20:35:29 crc kubenswrapper[4907]: I1009 20:35:29.588141 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzgr7" event={"ID":"7845796e-4db5-453a-90e3-3f013a6f6530","Type":"ContainerDied","Data":"13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087"} Oct 09 20:35:29 crc kubenswrapper[4907]: I1009 20:35:29.590733 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 20:35:30 crc kubenswrapper[4907]: I1009 20:35:30.599500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-pzgr7" event={"ID":"7845796e-4db5-453a-90e3-3f013a6f6530","Type":"ContainerStarted","Data":"18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c"} Oct 09 20:35:31 crc kubenswrapper[4907]: I1009 20:35:31.613693 4907 generic.go:334] "Generic (PLEG): container finished" podID="7845796e-4db5-453a-90e3-3f013a6f6530" containerID="18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c" exitCode=0 Oct 09 20:35:31 crc kubenswrapper[4907]: I1009 20:35:31.613762 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzgr7" event={"ID":"7845796e-4db5-453a-90e3-3f013a6f6530","Type":"ContainerDied","Data":"18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c"} Oct 09 20:35:31 crc kubenswrapper[4907]: I1009 20:35:31.926659 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-2pltk_d33978d9-506b-49de-ad7c-d4fd0cb80c79/nmstate-console-plugin/0.log" Oct 09 20:35:32 crc kubenswrapper[4907]: I1009 20:35:32.139944 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-92f76_a9bc1d23-a1e5-4879-ad7d-635639d6cb12/nmstate-handler/0.log" Oct 09 20:35:32 crc kubenswrapper[4907]: I1009 20:35:32.244043 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pfsd4_bda49bd2-44dc-4a59-becb-c3942059ab4d/kube-rbac-proxy/0.log" Oct 09 20:35:32 crc kubenswrapper[4907]: I1009 20:35:32.264074 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pfsd4_bda49bd2-44dc-4a59-becb-c3942059ab4d/nmstate-metrics/0.log" Oct 09 20:35:32 crc kubenswrapper[4907]: I1009 20:35:32.390620 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-2g9nb_9c9331a0-3676-4106-957f-5699d256f0d6/nmstate-operator/0.log" Oct 09 20:35:32 crc 
kubenswrapper[4907]: I1009 20:35:32.533792 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-n97k6_71ed6ce6-21bd-4132-9ba3-d344520de4a9/nmstate-webhook/0.log" Oct 09 20:35:32 crc kubenswrapper[4907]: I1009 20:35:32.624992 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzgr7" event={"ID":"7845796e-4db5-453a-90e3-3f013a6f6530","Type":"ContainerStarted","Data":"8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c"} Oct 09 20:35:32 crc kubenswrapper[4907]: I1009 20:35:32.640944 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pzgr7" podStartSLOduration=3.089768446 podStartE2EDuration="5.64092981s" podCreationTimestamp="2025-10-09 20:35:27 +0000 UTC" firstStartedPulling="2025-10-09 20:35:29.590502622 +0000 UTC m=+4015.122470111" lastFinishedPulling="2025-10-09 20:35:32.141663986 +0000 UTC m=+4017.673631475" observedRunningTime="2025-10-09 20:35:32.639528655 +0000 UTC m=+4018.171496154" watchObservedRunningTime="2025-10-09 20:35:32.64092981 +0000 UTC m=+4018.172897299" Oct 09 20:35:37 crc kubenswrapper[4907]: I1009 20:35:37.865879 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:37 crc kubenswrapper[4907]: I1009 20:35:37.866565 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:37 crc kubenswrapper[4907]: I1009 20:35:37.938891 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:38 crc kubenswrapper[4907]: I1009 20:35:38.758792 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:38 crc kubenswrapper[4907]: I1009 20:35:38.815304 4907 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzgr7"] Oct 09 20:35:40 crc kubenswrapper[4907]: I1009 20:35:40.727326 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pzgr7" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="registry-server" containerID="cri-o://8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c" gracePeriod=2 Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.442915 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.472130 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-utilities\") pod \"7845796e-4db5-453a-90e3-3f013a6f6530\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.472292 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-catalog-content\") pod \"7845796e-4db5-453a-90e3-3f013a6f6530\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.472567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62nhj\" (UniqueName: \"kubernetes.io/projected/7845796e-4db5-453a-90e3-3f013a6f6530-kube-api-access-62nhj\") pod \"7845796e-4db5-453a-90e3-3f013a6f6530\" (UID: \"7845796e-4db5-453a-90e3-3f013a6f6530\") " Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.473686 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-utilities" (OuterVolumeSpecName: "utilities") pod 
"7845796e-4db5-453a-90e3-3f013a6f6530" (UID: "7845796e-4db5-453a-90e3-3f013a6f6530"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.501696 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7845796e-4db5-453a-90e3-3f013a6f6530" (UID: "7845796e-4db5-453a-90e3-3f013a6f6530"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.509371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7845796e-4db5-453a-90e3-3f013a6f6530-kube-api-access-62nhj" (OuterVolumeSpecName: "kube-api-access-62nhj") pod "7845796e-4db5-453a-90e3-3f013a6f6530" (UID: "7845796e-4db5-453a-90e3-3f013a6f6530"). InnerVolumeSpecName "kube-api-access-62nhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.575510 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62nhj\" (UniqueName: \"kubernetes.io/projected/7845796e-4db5-453a-90e3-3f013a6f6530-kube-api-access-62nhj\") on node \"crc\" DevicePath \"\"" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.575821 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.575837 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7845796e-4db5-453a-90e3-3f013a6f6530-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.744899 4907 generic.go:334] "Generic (PLEG): container finished" podID="7845796e-4db5-453a-90e3-3f013a6f6530" containerID="8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c" exitCode=0 Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.744969 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzgr7" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.744974 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzgr7" event={"ID":"7845796e-4db5-453a-90e3-3f013a6f6530","Type":"ContainerDied","Data":"8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c"} Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.745059 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzgr7" event={"ID":"7845796e-4db5-453a-90e3-3f013a6f6530","Type":"ContainerDied","Data":"c40619008520b34813c8982fb67a21a2c7fe356fdc624a5455948f9bab5c7412"} Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.745084 4907 scope.go:117] "RemoveContainer" containerID="8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.774602 4907 scope.go:117] "RemoveContainer" containerID="18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.784159 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzgr7"] Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.791615 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzgr7"] Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.824951 4907 scope.go:117] "RemoveContainer" containerID="13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.874836 4907 scope.go:117] "RemoveContainer" containerID="8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c" Oct 09 20:35:41 crc kubenswrapper[4907]: E1009 20:35:41.875298 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c\": container with ID starting with 8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c not found: ID does not exist" containerID="8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.875337 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c"} err="failed to get container status \"8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c\": rpc error: code = NotFound desc = could not find container \"8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c\": container with ID starting with 8e47d737a42db36f1219fab584452095a57a2e4e7d51ad8bad996383e401fb5c not found: ID does not exist" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.875361 4907 scope.go:117] "RemoveContainer" containerID="18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c" Oct 09 20:35:41 crc kubenswrapper[4907]: E1009 20:35:41.875655 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c\": container with ID starting with 18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c not found: ID does not exist" containerID="18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.875688 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c"} err="failed to get container status \"18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c\": rpc error: code = NotFound desc = could not find container \"18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c\": container with ID 
starting with 18d1d9503a88494eb7fcf9c67b9e2c599aa83c9bed3c1f1e18c9e65fc9878f5c not found: ID does not exist" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.875706 4907 scope.go:117] "RemoveContainer" containerID="13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087" Oct 09 20:35:41 crc kubenswrapper[4907]: E1009 20:35:41.875965 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087\": container with ID starting with 13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087 not found: ID does not exist" containerID="13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087" Oct 09 20:35:41 crc kubenswrapper[4907]: I1009 20:35:41.875995 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087"} err="failed to get container status \"13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087\": rpc error: code = NotFound desc = could not find container \"13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087\": container with ID starting with 13d45e2f5eb256be8f3fb5a606b55094471c097d91f41a3c919b3e39764a0087 not found: ID does not exist" Oct 09 20:35:43 crc kubenswrapper[4907]: I1009 20:35:43.167185 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" path="/var/lib/kubelet/pods/7845796e-4db5-453a-90e3-3f013a6f6530/volumes" Oct 09 20:35:44 crc kubenswrapper[4907]: I1009 20:35:44.434908 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/kube-rbac-proxy/0.log" Oct 09 20:35:44 crc kubenswrapper[4907]: I1009 20:35:44.483279 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/manager/0.log" Oct 09 20:35:56 crc kubenswrapper[4907]: I1009 20:35:56.853595 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6sl6d_42cf9557-7cae-41c0-bbaa-a3baa099e36c/kube-rbac-proxy/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.025618 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-6sl6d_42cf9557-7cae-41c0-bbaa-a3baa099e36c/controller/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.115541 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.434702 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.441653 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.469690 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.532510 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.872320 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.890672 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.919133 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:35:57 crc kubenswrapper[4907]: I1009 20:35:57.945404 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.106032 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-frr-files/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.121992 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-reloader/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.130970 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/cp-metrics/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.216085 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/controller/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.351560 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/kube-rbac-proxy/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.373930 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/frr-metrics/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.443721 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/kube-rbac-proxy-frr/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.636725 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/reloader/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.730780 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-2z2vr_98b09a68-cabf-431c-9885-8f6e36c84de6/frr-k8s-webhook-server/0.log" Oct 09 20:35:58 crc kubenswrapper[4907]: I1009 20:35:58.915479 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-55785946f8-t68kr_5e6d0933-34d0-4cf2-bc08-75d11b13e618/manager/0.log" Oct 09 20:35:59 crc kubenswrapper[4907]: I1009 20:35:59.107915 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c6994cd9d-kw99q_7820e44f-78bd-4549-8f4b-d7f2ec3b2b1b/webhook-server/0.log" Oct 09 20:35:59 crc kubenswrapper[4907]: I1009 20:35:59.243591 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m6wfh_db2ff9e3-9b97-4ec9-8b10-c782c7784b8f/kube-rbac-proxy/0.log" Oct 09 20:35:59 crc kubenswrapper[4907]: I1009 20:35:59.608850 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-km469_ba53143e-9c68-4c9f-89b8-45429be8e899/frr/0.log" Oct 09 20:35:59 crc kubenswrapper[4907]: I1009 20:35:59.790925 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m6wfh_db2ff9e3-9b97-4ec9-8b10-c782c7784b8f/speaker/0.log" Oct 09 20:36:12 crc kubenswrapper[4907]: I1009 20:36:12.681000 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/util/0.log" Oct 09 20:36:12 crc kubenswrapper[4907]: 
I1009 20:36:12.896994 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/pull/0.log" Oct 09 20:36:12 crc kubenswrapper[4907]: I1009 20:36:12.910716 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/util/0.log" Oct 09 20:36:12 crc kubenswrapper[4907]: I1009 20:36:12.930196 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/pull/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.125057 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/util/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.149027 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/pull/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.204910 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_03c6e0f8bd928fdcaaf530d547155f7eef49635d3e29724a094c0ab694j5jmf_7b681ab0-8c28-47ed-9f9c-77f233a4ad91/extract/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.316380 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/util/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.512829 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/pull/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.519923 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/util/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.525523 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/pull/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.905411 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/util/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.938367 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/extract/0.log" Oct 09 20:36:13 crc kubenswrapper[4907]: I1009 20:36:13.946879 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_142e5edc705b0443a978f15b9d74db4e11d2db1d26a61e7f8c9e49e303tbhjc_20e43860-0c38-4f47-83e6-147765347183/pull/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.114610 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/util/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.253763 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/pull/0.log" Oct 09 
20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.264332 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/pull/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.304943 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/util/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.484283 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/extract/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.489137 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/pull/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.525007 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2pcdvv_36417bf7-a6b9-4677-baff-e04cd0e7f1dd/util/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.660712 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/util/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.814996 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/util/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.865539 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/pull/0.log" Oct 09 20:36:14 crc kubenswrapper[4907]: I1009 20:36:14.875012 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/pull/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.052247 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/util/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.081954 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/extract/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.106931 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d6b8cr_76767f10-290e-430e-890f-cd5e6769c46e/pull/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.263367 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-utilities/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.455939 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-utilities/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.501929 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-content/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.543875 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-content/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.728628 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-utilities/0.log" Oct 09 20:36:15 crc kubenswrapper[4907]: I1009 20:36:15.732822 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/extract-content/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.010815 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-utilities/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.187409 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-utilities/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.214130 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-content/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.306292 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-content/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.465207 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-utilities/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.537704 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-b59r4_50afc46f-91ad-47d4-9ef9-03be3cfa2df6/registry-server/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.613833 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/extract-content/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.772847 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/util/0.log" Oct 09 20:36:16 crc kubenswrapper[4907]: I1009 20:36:16.988665 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/util/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.060520 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/pull/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.088155 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/pull/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.215574 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z7tfr_ae0c82bc-f62f-4df0-a8d4-630a2124f553/registry-server/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.228153 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/util/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.267021 4907 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/pull/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.284556 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvtvkh_f26ae800-7648-48e3-a47c-ec626aead3dc/extract/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.410012 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-dmrx4_d043494b-98ab-482a-ba53-5f2445d01bea/marketplace-operator/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.464946 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-utilities/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.622634 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-content/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.650773 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-utilities/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.676413 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-content/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.851119 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-content/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.863986 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/extract-utilities/0.log" Oct 09 20:36:17 crc kubenswrapper[4907]: I1009 20:36:17.912582 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-utilities/0.log" Oct 09 20:36:18 crc kubenswrapper[4907]: I1009 20:36:18.009225 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bw8zg_4b61304d-c2e5-4a83-9eeb-9f6688f8e3b2/registry-server/0.log" Oct 09 20:36:18 crc kubenswrapper[4907]: I1009 20:36:18.134289 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-utilities/0.log" Oct 09 20:36:18 crc kubenswrapper[4907]: I1009 20:36:18.141807 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-content/0.log" Oct 09 20:36:18 crc kubenswrapper[4907]: I1009 20:36:18.152693 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-content/0.log" Oct 09 20:36:18 crc kubenswrapper[4907]: I1009 20:36:18.335629 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-content/0.log" Oct 09 20:36:18 crc kubenswrapper[4907]: I1009 20:36:18.414913 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/extract-utilities/0.log" Oct 09 20:36:18 crc kubenswrapper[4907]: I1009 20:36:18.813681 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5fdv4_ccd02e79-a674-49ff-895e-e691c2a42a17/registry-server/0.log" Oct 09 
20:36:30 crc kubenswrapper[4907]: I1009 20:36:30.588692 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-vfrf8_df906780-9fa6-4336-8b74-dd4061587bfe/prometheus-operator/0.log" Oct 09 20:36:30 crc kubenswrapper[4907]: I1009 20:36:30.768740 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f4db65768-4qzmd_d6b76317-83ef-4bab-b4bd-7940ca0c954e/prometheus-operator-admission-webhook/0.log" Oct 09 20:36:30 crc kubenswrapper[4907]: I1009 20:36:30.916604 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7f4db65768-7hnrz_d1d14ebb-33ce-4f94-b224-f267661a1704/prometheus-operator-admission-webhook/0.log" Oct 09 20:36:30 crc kubenswrapper[4907]: I1009 20:36:30.972756 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-qp2p6_8f351fc6-080d-41c9-ab41-44dc032b6579/operator/0.log" Oct 09 20:36:31 crc kubenswrapper[4907]: I1009 20:36:31.131927 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-csbcg_2bb9bd82-399c-43cb-aad5-37832f57ba4f/perses-operator/0.log" Oct 09 20:36:36 crc kubenswrapper[4907]: I1009 20:36:36.299290 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:36:36 crc kubenswrapper[4907]: I1009 20:36:36.299866 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 20:36:43 crc kubenswrapper[4907]: I1009 20:36:43.488775 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/kube-rbac-proxy/0.log"
Oct 09 20:36:43 crc kubenswrapper[4907]: I1009 20:36:43.546045 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6978c7c7cf-jnfh6_584227b0-c217-4ec6-81fc-195bf4da68f3/manager/0.log"
Oct 09 20:37:05 crc kubenswrapper[4907]: E1009 20:37:05.420082 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.104:44056->38.102.83.104:46823: write tcp 38.102.83.104:44056->38.102.83.104:46823: write: broken pipe
Oct 09 20:37:05 crc kubenswrapper[4907]: E1009 20:37:05.551611 4907 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.104:44162->38.102.83.104:46823: write tcp 38.102.83.104:44162->38.102.83.104:46823: write: broken pipe
Oct 09 20:37:06 crc kubenswrapper[4907]: I1009 20:37:06.299831 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 20:37:06 crc kubenswrapper[4907]: I1009 20:37:06.300129 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.299359 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.300004 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.300048 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt"
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.300791 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f5c4e292f4751d77becab3ec7da97c8f8af17a12dcbc6e8afb0dc4aff3a325c"} pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.300837 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" containerID="cri-o://9f5c4e292f4751d77becab3ec7da97c8f8af17a12dcbc6e8afb0dc4aff3a325c" gracePeriod=600
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.948029 4907 generic.go:334] "Generic (PLEG): container finished" podID="717141fe-c68d-4844-ad99-872d296a6370" containerID="9f5c4e292f4751d77becab3ec7da97c8f8af17a12dcbc6e8afb0dc4aff3a325c" exitCode=0
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.948116 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerDied","Data":"9f5c4e292f4751d77becab3ec7da97c8f8af17a12dcbc6e8afb0dc4aff3a325c"}
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.948386 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" event={"ID":"717141fe-c68d-4844-ad99-872d296a6370","Type":"ContainerStarted","Data":"c826d4c150070501c0cd44e10c8c219a469ddab4e69c3460d8f86f75945b325b"}
Oct 09 20:37:36 crc kubenswrapper[4907]: I1009 20:37:36.948413 4907 scope.go:117] "RemoveContainer" containerID="8ccdb447fe3699b99010b05a61039b9989be36ed9432208729ca3c2ffe6fc226"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.810801 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wth5z"]
Oct 09 20:37:51 crc kubenswrapper[4907]: E1009 20:37:51.812768 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="extract-utilities"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.812788 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="extract-utilities"
Oct 09 20:37:51 crc kubenswrapper[4907]: E1009 20:37:51.812836 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="extract-content"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.812845 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="extract-content"
Oct 09 20:37:51 crc kubenswrapper[4907]: E1009 20:37:51.812855 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="registry-server"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.812863 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="registry-server"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.813125 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7845796e-4db5-453a-90e3-3f013a6f6530" containerName="registry-server"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.815504 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.841157 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wth5z"]
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.976617 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jptbn\" (UniqueName: \"kubernetes.io/projected/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-kube-api-access-jptbn\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.976683 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-utilities\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:51 crc kubenswrapper[4907]: I1009 20:37:51.977118 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-catalog-content\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.079091 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-catalog-content\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.079587 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jptbn\" (UniqueName: \"kubernetes.io/projected/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-kube-api-access-jptbn\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.079734 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-utilities\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.079645 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-catalog-content\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.080360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-utilities\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.101453 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptbn\" (UniqueName: \"kubernetes.io/projected/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-kube-api-access-jptbn\") pod \"certified-operators-wth5z\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") " pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.138349 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:37:52 crc kubenswrapper[4907]: I1009 20:37:52.689906 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wth5z"]
Oct 09 20:37:53 crc kubenswrapper[4907]: I1009 20:37:53.123919 4907 generic.go:334] "Generic (PLEG): container finished" podID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerID="8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba" exitCode=0
Oct 09 20:37:53 crc kubenswrapper[4907]: I1009 20:37:53.123972 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wth5z" event={"ID":"f8a30acb-35ee-4f44-8dcf-2b19e30e9351","Type":"ContainerDied","Data":"8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba"}
Oct 09 20:37:53 crc kubenswrapper[4907]: I1009 20:37:53.124003 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wth5z" event={"ID":"f8a30acb-35ee-4f44-8dcf-2b19e30e9351","Type":"ContainerStarted","Data":"af470ff451ce22ee183c1249e36c27db6c7f7914134f345a759df159c19b7a7b"}
Oct 09 20:37:54 crc kubenswrapper[4907]: I1009 20:37:54.134451 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wth5z" event={"ID":"f8a30acb-35ee-4f44-8dcf-2b19e30e9351","Type":"ContainerStarted","Data":"06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6"}
Oct 09 20:37:55 crc kubenswrapper[4907]: I1009 20:37:55.164592 4907 generic.go:334] "Generic (PLEG): container finished" podID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerID="06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6" exitCode=0
Oct 09 20:37:55 crc kubenswrapper[4907]: I1009 20:37:55.164838 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wth5z" event={"ID":"f8a30acb-35ee-4f44-8dcf-2b19e30e9351","Type":"ContainerDied","Data":"06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6"}
Oct 09 20:37:56 crc kubenswrapper[4907]: I1009 20:37:56.177642 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wth5z" event={"ID":"f8a30acb-35ee-4f44-8dcf-2b19e30e9351","Type":"ContainerStarted","Data":"3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406"}
Oct 09 20:37:56 crc kubenswrapper[4907]: I1009 20:37:56.210190 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wth5z" podStartSLOduration=2.5414693 podStartE2EDuration="5.210165528s" podCreationTimestamp="2025-10-09 20:37:51 +0000 UTC" firstStartedPulling="2025-10-09 20:37:53.126165013 +0000 UTC m=+4158.658132502" lastFinishedPulling="2025-10-09 20:37:55.794861231 +0000 UTC m=+4161.326828730" observedRunningTime="2025-10-09 20:37:56.202790894 +0000 UTC m=+4161.734758403" watchObservedRunningTime="2025-10-09 20:37:56.210165528 +0000 UTC m=+4161.742133027"
Oct 09 20:38:02 crc kubenswrapper[4907]: I1009 20:38:02.138497 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:38:02 crc kubenswrapper[4907]: I1009 20:38:02.139207 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:38:02 crc kubenswrapper[4907]: I1009 20:38:02.544369 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:38:02 crc kubenswrapper[4907]: I1009 20:38:02.599405 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:38:02 crc kubenswrapper[4907]: I1009 20:38:02.798611 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wth5z"]
Oct 09 20:38:04 crc kubenswrapper[4907]: I1009 20:38:04.296621 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wth5z" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="registry-server" containerID="cri-o://3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406" gracePeriod=2
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.020675 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.180736 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jptbn\" (UniqueName: \"kubernetes.io/projected/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-kube-api-access-jptbn\") pod \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") "
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.181383 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-catalog-content\") pod \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") "
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.181458 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-utilities\") pod \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\" (UID: \"f8a30acb-35ee-4f44-8dcf-2b19e30e9351\") "
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.183945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-utilities" (OuterVolumeSpecName: "utilities") pod "f8a30acb-35ee-4f44-8dcf-2b19e30e9351" (UID: "f8a30acb-35ee-4f44-8dcf-2b19e30e9351"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.186571 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-kube-api-access-jptbn" (OuterVolumeSpecName: "kube-api-access-jptbn") pod "f8a30acb-35ee-4f44-8dcf-2b19e30e9351" (UID: "f8a30acb-35ee-4f44-8dcf-2b19e30e9351"). InnerVolumeSpecName "kube-api-access-jptbn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.238745 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8a30acb-35ee-4f44-8dcf-2b19e30e9351" (UID: "f8a30acb-35ee-4f44-8dcf-2b19e30e9351"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.284156 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.284191 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.284200 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jptbn\" (UniqueName: \"kubernetes.io/projected/f8a30acb-35ee-4f44-8dcf-2b19e30e9351-kube-api-access-jptbn\") on node \"crc\" DevicePath \"\""
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.307201 4907 generic.go:334] "Generic (PLEG): container finished" podID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerID="3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406" exitCode=0
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.307248 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wth5z" event={"ID":"f8a30acb-35ee-4f44-8dcf-2b19e30e9351","Type":"ContainerDied","Data":"3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406"}
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.307280 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wth5z" event={"ID":"f8a30acb-35ee-4f44-8dcf-2b19e30e9351","Type":"ContainerDied","Data":"af470ff451ce22ee183c1249e36c27db6c7f7914134f345a759df159c19b7a7b"}
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.307296 4907 scope.go:117] "RemoveContainer" containerID="3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.308114 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wth5z"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.340708 4907 scope.go:117] "RemoveContainer" containerID="06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.349252 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wth5z"]
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.358808 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wth5z"]
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.372376 4907 scope.go:117] "RemoveContainer" containerID="8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.418673 4907 scope.go:117] "RemoveContainer" containerID="3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406"
Oct 09 20:38:05 crc kubenswrapper[4907]: E1009 20:38:05.419176 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406\": container with ID starting with 3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406 not found: ID does not exist" containerID="3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.419228 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406"} err="failed to get container status \"3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406\": rpc error: code = NotFound desc = could not find container \"3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406\": container with ID starting with 3e7f477b5986ffe91d3ec63ef0307670e9b7983588b020212f21cf9d1bdcd406 not found: ID does not exist"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.419255 4907 scope.go:117] "RemoveContainer" containerID="06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6"
Oct 09 20:38:05 crc kubenswrapper[4907]: E1009 20:38:05.419608 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6\": container with ID starting with 06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6 not found: ID does not exist" containerID="06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.419646 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6"} err="failed to get container status \"06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6\": rpc error: code = NotFound desc = could not find container \"06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6\": container with ID starting with 06728ad86d0bb2837630a29811a4aff550a6d8ad5be264ede97c8b8086d965a6 not found: ID does not exist"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.419669 4907 scope.go:117] "RemoveContainer" containerID="8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba"
Oct 09 20:38:05 crc kubenswrapper[4907]: E1009 20:38:05.420052 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba\": container with ID starting with 8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba not found: ID does not exist" containerID="8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba"
Oct 09 20:38:05 crc kubenswrapper[4907]: I1009 20:38:05.420102 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba"} err="failed to get container status \"8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba\": rpc error: code = NotFound desc = could not find container \"8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba\": container with ID starting with 8c7d593867d3d26df077764de22212c80f1310b32f92a3d31156c2f5d43c5fba not found: ID does not exist"
Oct 09 20:38:07 crc kubenswrapper[4907]: I1009 20:38:07.179524 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" path="/var/lib/kubelet/pods/f8a30acb-35ee-4f44-8dcf-2b19e30e9351/volumes"
Oct 09 20:38:19 crc kubenswrapper[4907]: I1009 20:38:19.481159 4907 generic.go:334] "Generic (PLEG): container finished" podID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerID="d47234031804beb9b6e0605be223a96a9cd0e31108794806a0b6156140b5cb8f" exitCode=0
Oct 09 20:38:19 crc kubenswrapper[4907]: I1009 20:38:19.481227 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4nw95/must-gather-mhx4n" event={"ID":"ae39e144-b0c3-451f-ac6a-d4638f209ed8","Type":"ContainerDied","Data":"d47234031804beb9b6e0605be223a96a9cd0e31108794806a0b6156140b5cb8f"}
Oct 09 20:38:19 crc kubenswrapper[4907]: I1009 20:38:19.482607 4907 scope.go:117] "RemoveContainer" containerID="d47234031804beb9b6e0605be223a96a9cd0e31108794806a0b6156140b5cb8f"
Oct 09 20:38:20 crc kubenswrapper[4907]: I1009 20:38:20.315455 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4nw95_must-gather-mhx4n_ae39e144-b0c3-451f-ac6a-d4638f209ed8/gather/0.log"
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.164088 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4nw95/must-gather-mhx4n"]
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.164977 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4nw95/must-gather-mhx4n" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerName="copy" containerID="cri-o://9727f825f6da7b4af12dc1fa904927e38ce64af5f4c8398c1c4deeae92af94c8" gracePeriod=2
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.182114 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4nw95/must-gather-mhx4n"]
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.597618 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4nw95_must-gather-mhx4n_ae39e144-b0c3-451f-ac6a-d4638f209ed8/copy/0.log"
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.598436 4907 generic.go:334] "Generic (PLEG): container finished" podID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerID="9727f825f6da7b4af12dc1fa904927e38ce64af5f4c8398c1c4deeae92af94c8" exitCode=143
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.793181 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4nw95_must-gather-mhx4n_ae39e144-b0c3-451f-ac6a-d4638f209ed8/copy/0.log"
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.793648 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/must-gather-mhx4n"
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.931052 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4k5q\" (UniqueName: \"kubernetes.io/projected/ae39e144-b0c3-451f-ac6a-d4638f209ed8-kube-api-access-w4k5q\") pod \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") "
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.931154 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae39e144-b0c3-451f-ac6a-d4638f209ed8-must-gather-output\") pod \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\" (UID: \"ae39e144-b0c3-451f-ac6a-d4638f209ed8\") "
Oct 09 20:38:30 crc kubenswrapper[4907]: I1009 20:38:30.937813 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae39e144-b0c3-451f-ac6a-d4638f209ed8-kube-api-access-w4k5q" (OuterVolumeSpecName: "kube-api-access-w4k5q") pod "ae39e144-b0c3-451f-ac6a-d4638f209ed8" (UID: "ae39e144-b0c3-451f-ac6a-d4638f209ed8"). InnerVolumeSpecName "kube-api-access-w4k5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.058151 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4k5q\" (UniqueName: \"kubernetes.io/projected/ae39e144-b0c3-451f-ac6a-d4638f209ed8-kube-api-access-w4k5q\") on node \"crc\" DevicePath \"\""
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.135878 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae39e144-b0c3-451f-ac6a-d4638f209ed8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ae39e144-b0c3-451f-ac6a-d4638f209ed8" (UID: "ae39e144-b0c3-451f-ac6a-d4638f209ed8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.159880 4907 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae39e144-b0c3-451f-ac6a-d4638f209ed8-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.164546 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" path="/var/lib/kubelet/pods/ae39e144-b0c3-451f-ac6a-d4638f209ed8/volumes"
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.611993 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4nw95_must-gather-mhx4n_ae39e144-b0c3-451f-ac6a-d4638f209ed8/copy/0.log"
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.612703 4907 scope.go:117] "RemoveContainer" containerID="9727f825f6da7b4af12dc1fa904927e38ce64af5f4c8398c1c4deeae92af94c8"
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.612838 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4nw95/must-gather-mhx4n"
Oct 09 20:38:31 crc kubenswrapper[4907]: I1009 20:38:31.659669 4907 scope.go:117] "RemoveContainer" containerID="d47234031804beb9b6e0605be223a96a9cd0e31108794806a0b6156140b5cb8f"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.257656 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vhtfh"]
Oct 09 20:38:50 crc kubenswrapper[4907]: E1009 20:38:50.258608 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="registry-server"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258621 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="registry-server"
Oct 09 20:38:50 crc kubenswrapper[4907]: E1009 20:38:50.258644 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="extract-utilities"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258650 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="extract-utilities"
Oct 09 20:38:50 crc kubenswrapper[4907]: E1009 20:38:50.258666 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerName="gather"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258672 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerName="gather"
Oct 09 20:38:50 crc kubenswrapper[4907]: E1009 20:38:50.258682 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerName="copy"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258688 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerName="copy"
Oct 09 20:38:50 crc kubenswrapper[4907]: E1009 20:38:50.258699 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="extract-content"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258705 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="extract-content"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258895 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerName="gather"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258907 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a30acb-35ee-4f44-8dcf-2b19e30e9351" containerName="registry-server"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.258934 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae39e144-b0c3-451f-ac6a-d4638f209ed8" containerName="copy"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.261376 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.300094 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhtfh"]
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.351105 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgx2l\" (UniqueName: \"kubernetes.io/projected/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-kube-api-access-kgx2l\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.351508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-utilities\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.351580 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-catalog-content\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.453691 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgx2l\" (UniqueName: \"kubernetes.io/projected/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-kube-api-access-kgx2l\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.453855 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-utilities\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.453962 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-catalog-content\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.454552 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-catalog-content\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.454835 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-utilities\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.476344 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgx2l\" (UniqueName: \"kubernetes.io/projected/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-kube-api-access-kgx2l\") pod \"redhat-operators-vhtfh\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:50 crc kubenswrapper[4907]: I1009 20:38:50.593763 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhtfh"
Oct 09 20:38:51 crc kubenswrapper[4907]: I1009 20:38:51.133536 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhtfh"]
Oct 09 20:38:51 crc kubenswrapper[4907]: W1009 20:38:51.167750 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf7edd0_787d_4bf1_abcc_a6485d007fb6.slice/crio-d3834c5662383fe68977c03be8e34816b4571dce9fc52b7bed64cdba3d90230d WatchSource:0}: Error finding container d3834c5662383fe68977c03be8e34816b4571dce9fc52b7bed64cdba3d90230d: Status 404 returned error can't find the container with id d3834c5662383fe68977c03be8e34816b4571dce9fc52b7bed64cdba3d90230d
Oct 09 20:38:51 crc kubenswrapper[4907]: I1009 20:38:51.831769 4907 generic.go:334] "Generic (PLEG): container finished" podID="fdf7edd0-787d-4bf1-abcc-a6485d007fb6" containerID="ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b" exitCode=0
Oct 09 20:38:51 crc kubenswrapper[4907]: I1009 20:38:51.832096 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhtfh" event={"ID":"fdf7edd0-787d-4bf1-abcc-a6485d007fb6","Type":"ContainerDied","Data":"ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b"}
Oct 09 20:38:51 crc kubenswrapper[4907]: I1009 20:38:51.832123 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhtfh" event={"ID":"fdf7edd0-787d-4bf1-abcc-a6485d007fb6","Type":"ContainerStarted","Data":"d3834c5662383fe68977c03be8e34816b4571dce9fc52b7bed64cdba3d90230d"}
Oct 09 20:38:52 crc kubenswrapper[4907]: I1009 20:38:52.845007 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhtfh" event={"ID":"fdf7edd0-787d-4bf1-abcc-a6485d007fb6","Type":"ContainerStarted","Data":"8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d"}
Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.655057 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ld4r"]
Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.658062 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ld4r"
Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.670036 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ld4r"]
Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.763150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-utilities\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r"
Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.763439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-catalog-content\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r"
Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.763515 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx9h6\" (UniqueName: \"kubernetes.io/projected/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-kube-api-access-cx9h6\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r"
Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.866817
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx9h6\" (UniqueName: \"kubernetes.io/projected/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-kube-api-access-cx9h6\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.867051 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-utilities\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.867123 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-catalog-content\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.867997 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-catalog-content\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.868880 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-utilities\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.870469 4907 generic.go:334] "Generic (PLEG): 
container finished" podID="fdf7edd0-787d-4bf1-abcc-a6485d007fb6" containerID="8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d" exitCode=0 Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.870520 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhtfh" event={"ID":"fdf7edd0-787d-4bf1-abcc-a6485d007fb6","Type":"ContainerDied","Data":"8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d"} Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.891245 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx9h6\" (UniqueName: \"kubernetes.io/projected/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-kube-api-access-cx9h6\") pod \"community-operators-8ld4r\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:38:55 crc kubenswrapper[4907]: I1009 20:38:55.975372 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:38:56 crc kubenswrapper[4907]: I1009 20:38:56.548571 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ld4r"] Oct 09 20:38:56 crc kubenswrapper[4907]: W1009 20:38:56.555832 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb7e5cfd_b1f9_498e_84f9_be96851f1cc9.slice/crio-b4dfda4970c624c7ded6d8d82a480b98a5aa697f2f9c5cebe0c6a6e97ead7e61 WatchSource:0}: Error finding container b4dfda4970c624c7ded6d8d82a480b98a5aa697f2f9c5cebe0c6a6e97ead7e61: Status 404 returned error can't find the container with id b4dfda4970c624c7ded6d8d82a480b98a5aa697f2f9c5cebe0c6a6e97ead7e61 Oct 09 20:38:56 crc kubenswrapper[4907]: I1009 20:38:56.882073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhtfh" 
event={"ID":"fdf7edd0-787d-4bf1-abcc-a6485d007fb6","Type":"ContainerStarted","Data":"830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5"} Oct 09 20:38:56 crc kubenswrapper[4907]: I1009 20:38:56.885357 4907 generic.go:334] "Generic (PLEG): container finished" podID="db7e5cfd-b1f9-498e-84f9-be96851f1cc9" containerID="52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5" exitCode=0 Oct 09 20:38:56 crc kubenswrapper[4907]: I1009 20:38:56.885393 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ld4r" event={"ID":"db7e5cfd-b1f9-498e-84f9-be96851f1cc9","Type":"ContainerDied","Data":"52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5"} Oct 09 20:38:56 crc kubenswrapper[4907]: I1009 20:38:56.885414 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ld4r" event={"ID":"db7e5cfd-b1f9-498e-84f9-be96851f1cc9","Type":"ContainerStarted","Data":"b4dfda4970c624c7ded6d8d82a480b98a5aa697f2f9c5cebe0c6a6e97ead7e61"} Oct 09 20:38:56 crc kubenswrapper[4907]: I1009 20:38:56.908975 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vhtfh" podStartSLOduration=2.412790866 podStartE2EDuration="6.908954373s" podCreationTimestamp="2025-10-09 20:38:50 +0000 UTC" firstStartedPulling="2025-10-09 20:38:51.834232413 +0000 UTC m=+4217.366199902" lastFinishedPulling="2025-10-09 20:38:56.33039592 +0000 UTC m=+4221.862363409" observedRunningTime="2025-10-09 20:38:56.899432105 +0000 UTC m=+4222.431399594" watchObservedRunningTime="2025-10-09 20:38:56.908954373 +0000 UTC m=+4222.440921862" Oct 09 20:38:58 crc kubenswrapper[4907]: I1009 20:38:58.909547 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ld4r" 
event={"ID":"db7e5cfd-b1f9-498e-84f9-be96851f1cc9","Type":"ContainerStarted","Data":"0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1"} Oct 09 20:39:00 crc kubenswrapper[4907]: I1009 20:39:00.594210 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vhtfh" Oct 09 20:39:00 crc kubenswrapper[4907]: I1009 20:39:00.594584 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vhtfh" Oct 09 20:39:01 crc kubenswrapper[4907]: I1009 20:39:01.650916 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vhtfh" podUID="fdf7edd0-787d-4bf1-abcc-a6485d007fb6" containerName="registry-server" probeResult="failure" output=< Oct 09 20:39:01 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Oct 09 20:39:01 crc kubenswrapper[4907]: > Oct 09 20:39:02 crc kubenswrapper[4907]: I1009 20:39:02.949325 4907 generic.go:334] "Generic (PLEG): container finished" podID="db7e5cfd-b1f9-498e-84f9-be96851f1cc9" containerID="0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1" exitCode=0 Oct 09 20:39:02 crc kubenswrapper[4907]: I1009 20:39:02.949421 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ld4r" event={"ID":"db7e5cfd-b1f9-498e-84f9-be96851f1cc9","Type":"ContainerDied","Data":"0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1"} Oct 09 20:39:03 crc kubenswrapper[4907]: I1009 20:39:03.961607 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ld4r" event={"ID":"db7e5cfd-b1f9-498e-84f9-be96851f1cc9","Type":"ContainerStarted","Data":"deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5"} Oct 09 20:39:03 crc kubenswrapper[4907]: I1009 20:39:03.983751 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-8ld4r" podStartSLOduration=2.512787397 podStartE2EDuration="8.983729949s" podCreationTimestamp="2025-10-09 20:38:55 +0000 UTC" firstStartedPulling="2025-10-09 20:38:56.886752009 +0000 UTC m=+4222.418719498" lastFinishedPulling="2025-10-09 20:39:03.357694561 +0000 UTC m=+4228.889662050" observedRunningTime="2025-10-09 20:39:03.98135209 +0000 UTC m=+4229.513319589" watchObservedRunningTime="2025-10-09 20:39:03.983729949 +0000 UTC m=+4229.515697448" Oct 09 20:39:06 crc kubenswrapper[4907]: I1009 20:39:05.987416 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:39:06 crc kubenswrapper[4907]: I1009 20:39:06.006171 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:39:06 crc kubenswrapper[4907]: I1009 20:39:06.057635 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:39:10 crc kubenswrapper[4907]: I1009 20:39:10.647860 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vhtfh" Oct 09 20:39:10 crc kubenswrapper[4907]: I1009 20:39:10.718522 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vhtfh" Oct 09 20:39:10 crc kubenswrapper[4907]: I1009 20:39:10.884509 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhtfh"] Oct 09 20:39:12 crc kubenswrapper[4907]: I1009 20:39:12.104000 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vhtfh" podUID="fdf7edd0-787d-4bf1-abcc-a6485d007fb6" containerName="registry-server" containerID="cri-o://830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5" gracePeriod=2 Oct 09 20:39:12 
crc kubenswrapper[4907]: I1009 20:39:12.782627 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhtfh" Oct 09 20:39:12 crc kubenswrapper[4907]: I1009 20:39:12.954411 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-utilities\") pod \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " Oct 09 20:39:12 crc kubenswrapper[4907]: I1009 20:39:12.954991 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-catalog-content\") pod \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " Oct 09 20:39:12 crc kubenswrapper[4907]: I1009 20:39:12.955160 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgx2l\" (UniqueName: \"kubernetes.io/projected/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-kube-api-access-kgx2l\") pod \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\" (UID: \"fdf7edd0-787d-4bf1-abcc-a6485d007fb6\") " Oct 09 20:39:12 crc kubenswrapper[4907]: I1009 20:39:12.955448 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-utilities" (OuterVolumeSpecName: "utilities") pod "fdf7edd0-787d-4bf1-abcc-a6485d007fb6" (UID: "fdf7edd0-787d-4bf1-abcc-a6485d007fb6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:39:12 crc kubenswrapper[4907]: I1009 20:39:12.956111 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:39:12 crc kubenswrapper[4907]: I1009 20:39:12.961380 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-kube-api-access-kgx2l" (OuterVolumeSpecName: "kube-api-access-kgx2l") pod "fdf7edd0-787d-4bf1-abcc-a6485d007fb6" (UID: "fdf7edd0-787d-4bf1-abcc-a6485d007fb6"). InnerVolumeSpecName "kube-api-access-kgx2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.027693 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdf7edd0-787d-4bf1-abcc-a6485d007fb6" (UID: "fdf7edd0-787d-4bf1-abcc-a6485d007fb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.058012 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.058052 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgx2l\" (UniqueName: \"kubernetes.io/projected/fdf7edd0-787d-4bf1-abcc-a6485d007fb6-kube-api-access-kgx2l\") on node \"crc\" DevicePath \"\"" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.115121 4907 generic.go:334] "Generic (PLEG): container finished" podID="fdf7edd0-787d-4bf1-abcc-a6485d007fb6" containerID="830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5" exitCode=0 Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.115168 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhtfh" event={"ID":"fdf7edd0-787d-4bf1-abcc-a6485d007fb6","Type":"ContainerDied","Data":"830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5"} Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.115214 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhtfh" event={"ID":"fdf7edd0-787d-4bf1-abcc-a6485d007fb6","Type":"ContainerDied","Data":"d3834c5662383fe68977c03be8e34816b4571dce9fc52b7bed64cdba3d90230d"} Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.115232 4907 scope.go:117] "RemoveContainer" containerID="830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.115241 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhtfh" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.140798 4907 scope.go:117] "RemoveContainer" containerID="8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.173570 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhtfh"] Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.173628 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vhtfh"] Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.174069 4907 scope.go:117] "RemoveContainer" containerID="ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.217387 4907 scope.go:117] "RemoveContainer" containerID="830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5" Oct 09 20:39:13 crc kubenswrapper[4907]: E1009 20:39:13.217985 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5\": container with ID starting with 830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5 not found: ID does not exist" containerID="830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.218038 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5"} err="failed to get container status \"830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5\": rpc error: code = NotFound desc = could not find container \"830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5\": container with ID starting with 830551cb579b97c95519ba1cdab4eaa8ca04ef7653875873dbc8f7127d7e21b5 not found: ID does 
not exist" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.218070 4907 scope.go:117] "RemoveContainer" containerID="8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d" Oct 09 20:39:13 crc kubenswrapper[4907]: E1009 20:39:13.218523 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d\": container with ID starting with 8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d not found: ID does not exist" containerID="8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.218550 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d"} err="failed to get container status \"8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d\": rpc error: code = NotFound desc = could not find container \"8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d\": container with ID starting with 8f598fec60d90b6b8d876d07812cec27cc6dd2d7b41af2857f0d89386ff5042d not found: ID does not exist" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.218572 4907 scope.go:117] "RemoveContainer" containerID="ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b" Oct 09 20:39:13 crc kubenswrapper[4907]: E1009 20:39:13.218843 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b\": container with ID starting with ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b not found: ID does not exist" containerID="ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b" Oct 09 20:39:13 crc kubenswrapper[4907]: I1009 20:39:13.218868 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b"} err="failed to get container status \"ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b\": rpc error: code = NotFound desc = could not find container \"ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b\": container with ID starting with ab9eceff9297f80ee70b37da2a96f59a852bceef53c73efb515ad2e561f90e6b not found: ID does not exist" Oct 09 20:39:15 crc kubenswrapper[4907]: I1009 20:39:15.164107 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf7edd0-787d-4bf1-abcc-a6485d007fb6" path="/var/lib/kubelet/pods/fdf7edd0-787d-4bf1-abcc-a6485d007fb6/volumes" Oct 09 20:39:16 crc kubenswrapper[4907]: I1009 20:39:16.062078 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:39:16 crc kubenswrapper[4907]: I1009 20:39:16.291621 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ld4r"] Oct 09 20:39:16 crc kubenswrapper[4907]: I1009 20:39:16.291882 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ld4r" podUID="db7e5cfd-b1f9-498e-84f9-be96851f1cc9" containerName="registry-server" containerID="cri-o://deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5" gracePeriod=2 Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.001835 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.145985 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-utilities\") pod \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.146345 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-catalog-content\") pod \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.146598 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx9h6\" (UniqueName: \"kubernetes.io/projected/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-kube-api-access-cx9h6\") pod \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\" (UID: \"db7e5cfd-b1f9-498e-84f9-be96851f1cc9\") " Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.146974 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-utilities" (OuterVolumeSpecName: "utilities") pod "db7e5cfd-b1f9-498e-84f9-be96851f1cc9" (UID: "db7e5cfd-b1f9-498e-84f9-be96851f1cc9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.147237 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.156838 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-kube-api-access-cx9h6" (OuterVolumeSpecName: "kube-api-access-cx9h6") pod "db7e5cfd-b1f9-498e-84f9-be96851f1cc9" (UID: "db7e5cfd-b1f9-498e-84f9-be96851f1cc9"). InnerVolumeSpecName "kube-api-access-cx9h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.158647 4907 generic.go:334] "Generic (PLEG): container finished" podID="db7e5cfd-b1f9-498e-84f9-be96851f1cc9" containerID="deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5" exitCode=0 Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.158721 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ld4r" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.197021 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db7e5cfd-b1f9-498e-84f9-be96851f1cc9" (UID: "db7e5cfd-b1f9-498e-84f9-be96851f1cc9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.205795 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ld4r" event={"ID":"db7e5cfd-b1f9-498e-84f9-be96851f1cc9","Type":"ContainerDied","Data":"deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5"} Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.205872 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ld4r" event={"ID":"db7e5cfd-b1f9-498e-84f9-be96851f1cc9","Type":"ContainerDied","Data":"b4dfda4970c624c7ded6d8d82a480b98a5aa697f2f9c5cebe0c6a6e97ead7e61"} Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.205895 4907 scope.go:117] "RemoveContainer" containerID="deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.230695 4907 scope.go:117] "RemoveContainer" containerID="0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.248907 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx9h6\" (UniqueName: \"kubernetes.io/projected/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-kube-api-access-cx9h6\") on node \"crc\" DevicePath \"\"" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.248942 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7e5cfd-b1f9-498e-84f9-be96851f1cc9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.257111 4907 scope.go:117] "RemoveContainer" containerID="52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.298864 4907 scope.go:117] "RemoveContainer" containerID="deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5" Oct 09 20:39:17 crc 
kubenswrapper[4907]: E1009 20:39:17.299820 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5\": container with ID starting with deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5 not found: ID does not exist" containerID="deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.300085 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5"} err="failed to get container status \"deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5\": rpc error: code = NotFound desc = could not find container \"deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5\": container with ID starting with deefa6508f20786807a8a2ea60eb98c616b4c9b2189998c1cb9e87ad1d8098f5 not found: ID does not exist" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.300109 4907 scope.go:117] "RemoveContainer" containerID="0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1" Oct 09 20:39:17 crc kubenswrapper[4907]: E1009 20:39:17.300402 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1\": container with ID starting with 0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1 not found: ID does not exist" containerID="0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.300424 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1"} err="failed to get container status 
\"0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1\": rpc error: code = NotFound desc = could not find container \"0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1\": container with ID starting with 0c8b09075f14abe4701b4c739420cff70455b559ad02d1618ba064cdcb5b53a1 not found: ID does not exist" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.300438 4907 scope.go:117] "RemoveContainer" containerID="52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5" Oct 09 20:39:17 crc kubenswrapper[4907]: E1009 20:39:17.300733 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5\": container with ID starting with 52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5 not found: ID does not exist" containerID="52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.300767 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5"} err="failed to get container status \"52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5\": rpc error: code = NotFound desc = could not find container \"52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5\": container with ID starting with 52f86c6ab954bf7ea5b8a10cf1520b4aa1958797de3a294f82f204261b6310d5 not found: ID does not exist" Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.512672 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ld4r"] Oct 09 20:39:17 crc kubenswrapper[4907]: I1009 20:39:17.523355 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ld4r"] Oct 09 20:39:19 crc kubenswrapper[4907]: I1009 20:39:19.166976 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7e5cfd-b1f9-498e-84f9-be96851f1cc9" path="/var/lib/kubelet/pods/db7e5cfd-b1f9-498e-84f9-be96851f1cc9/volumes" Oct 09 20:39:36 crc kubenswrapper[4907]: I1009 20:39:36.300151 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:39:36 crc kubenswrapper[4907]: I1009 20:39:36.300730 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 20:40:06 crc kubenswrapper[4907]: I1009 20:40:06.299934 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v2wbt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 20:40:06 crc kubenswrapper[4907]: I1009 20:40:06.300549 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v2wbt" podUID="717141fe-c68d-4844-ad99-872d296a6370" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"